Saturday, November 29, 2008
The audio article posted combines my entries Concurrency in Game Development and Why Do We Use C++?
Industry Broadcast is the result of Ryan's great idea to offer insight from game developers in audio format. He says -
I was toiling over one disappointing aspect of almost every one of my days: That being that I have so little time outside of my 16-20 hour work day to be a human being that I never am able to sit down and read enough of the articles being published online about the games industry. Occasionally my RSS feeder throws something at me that I just have to make time for but for the most part I see the headlines of features and articles and have to regretfully click away and focus on the 65 new emails sitting in my mail box. I thought "God, if I could just have an LCD in the shower with me I might actually have the ability to digest the plethora of useful information regularly being shared by the top minds in our industry." And then an even better idea struck me: If all of this material was available in Audio Format then I, as well as fellow developers, could be listening to these amazing articles whilst working out or doing the dishes, or or or.
Ryan also maintains a personal blog here.
Sunday, November 16, 2008
In researching this post, I went through my journals from that time period to try to figure out exactly how Undead failed. One glaring issue is that ~75% of the journal was about girls, with the remainder divided among writing music, painting classes and working on the game. While totally expected for a distractible teen, given the difficulty of the task and a rapidly approaching deadline (discussed later), it's not a good indicator of the project's eventual success. Nevertheless, software development is a human activity and must coexist with other human activities. Initially, I was more interested in producing an Ultima-esque demo that would get me a job at Origin Systems.
I met a classmate in my speech class at school named Alex Kapadia. Alex had some experience with sound programming and was a game fanatic. I showed him what I'd built, he was thrilled and we began collaborating. At the time, we were playing a game called Solar Winds distributed by a small shareware company called Epic MegaGames before it shortened its name and became known as the world-class developer of Unreal and Gears of War. Each time the game ended, up popped an ad by Epic requesting developers to send in their games for publishing. We sent Undead to Epic and awaited a response. We didn't have to wait long. In early summer 1993 we knew that we would be working with Epic. We sent several versions to Epic over the following year. As it started to gel a bit, we began working more closely with them.
JOURNAL 12/06/93: Talked to Tim Sweeney (president of Epic MegaGames) today. Wow, he knows games. He gave me a lot of good suggestions. He's looking into an artist and musician for the game. He was talking some intense MONEY!!!!!!!!!!
Tim estimated that the game would make Alex and me $40,000, which was an unfathomable amount of money for me as a 19-year-old.
JOURNAL 12/08/93: Today Dan Froelich, musician and composer for Epic called me. He lives in Kansas City (that's crazy!) and works for Informix. He told me a lot about how the game publishing thing works. After finals week, we'll get together.
We did end up getting together, Dan, Alex and I at Manny's, the finest Mexican restaurant in Kansas City. He told us a lot about Epic and the business of developing shareware games. He also brought beta copies of Epic's Xargon and Ken's Labyrinth, developed by wunderkind Ken Silverman.
From Tim Sweeney (via E-mail):
Date: Wed, 15 Dec 1993 04:28:24 -0500 (EST)
From: Tim Sweeney <70451.633@compuserve.com>
Subject: RE: Undead & Pinball
Message-id: <931215092823_70451.633_fhg40-1@compuserve.com>
Thanks! It's great that you and Dan live in the same town. I hope you'll be interested in working with Dan - he's been with us on Jill, Kiloblaster, Brix, Solar Winds, Adventure Math, and Xargon and his experience working with these projects would be helpful in addition to his music! I'm looking forward to seeing more of Undead! It's the best "first" game any author has shown us. Judging by your coding and artwork, your game has great potential and you have tremendous potential as an author. Undead still has a long way to go in developing a story, creating all the artwork, and turning it into a fun, unique, and successful shareware game, but it looks like you have the perseverance to see it through to the end. Keep up the great work!
-Tim, Epic MegaGames
(3:22 AM - normal business hours for we game developers) :-)
A House of Cards
In late 1993, we believed Undead was going to be an enormous hit. The possibilities appeared endless. Even so, I was starting to deeply fear that I wouldn't be able to finish it. Undead had grown in complexity so much that modifying it was becoming difficult. It was approaching 20,000 lines of C and assembly and because it had no higher level architecture to speak of, I was losing my ability to understand it. Disparate pieces were tightly coupled. There was no layering of subsystems. Significant numerical values were hard-coded everywhere. Every line of code I added made it harder to work with. It was like going from troweling wet concrete to chiseling hardened concrete. This was a classic case of underengineering and I was unequipped to fix it. I was scared. Instead of confronting that issue, I began obsessing over the art - most of the development time was now spent in DeluxePaint getting the art just right.
JOURNAL 03/02/94: Undead is gaining bugs as I try to fix it! Argh. And it's having memory problems -- specifically, not enough static memory.
Worse yet, I had a deadline that I hadn't yet shared with Epic. In a year, more or less, I'd be leaving the country for two years without access to a computer. Now, I suspect I could have fixed Undead's woes given enough time, but I didn't have that time.
Man on a Mission
I grew up in the Mormon (LDS) church. Although I left the church many years ago now, at the time I was devout and determined (and expected) to become a missionary (I'll write sometime later about why this was the right choice, despite my eventual leaving). Around March 1994, I called Tim and told him that I had committed to leave for a two-year mission to central Mexico sometime around September. Understandably, he wasn't happy. I'd naively assumed (wishful thinking) that I would be able to get everything done before leaving in the fall. He knew better.
JOURNAL 03/26/94: BAD NEWS. Tim Sweeney got the copy of Undead. He told me he thought it was fantastic, but that there was no way on earth I was going to get it done before August. In addition, he said it would be absolutely obsolete when I got back. 03/27/94: I didn't realize how upset I was about what Tim said about Undead not being done until I woke this morning and I had been having nightmares about it all night. Of course he's right and I knew it anyway.
Obsolescence was always on our tail and we felt its pressure even during development. The style of game that Undead represented was an Ultima VI class game when the RPG state of the art was Ultima VII and in 1993, Doom (as shareware no less!) splayed the writing on the wall for all 2D games.
Perhaps, as a project, this was doomed from the outset for many reasons. The most obvious was my leaving before it was done. Compared to what I build today, though, the game was not at all complex. It should have been straightforward to build it in a year-and-a-half. It's been said that programming ability manifests at a young age, while skill in software architecture comes much later. In the end, I knew the dark secret. It wasn't merely unfinished, it was unfinishable.
Saturday, November 15, 2008
Connecting with old friends on Facebook lately led me to remember those early days as a teenager writing video games. It's incredible how much has changed since then. It was nearly impossible to get any information on how to program a computer, much less on how to write games. The library books were typically user and application-oriented. C and assembly books could only be had at the bookstore at ~$60 apiece. In the days before bookstores came with sitting and reading furniture, and being a kid without a lot of cash, I'd go to Waldenbooks in the mall and stand there reading, absorbing as much as possible before I had to leave. Today, on the other hand, there's rarely a need to buy technical books because whatever I need can be found on the web faster and for free. Today is definitely better.
With a tiny bit of programming in school (Applesoft and Commodore BASIC) when we got our first computer (a Tandy 1000SX), I naturally started with GW-BASIC. It was too slow for the kinds of games I wanted to make - games like Ultima V and Times of Lore.
For Christmas, I asked my parents for a BASIC Compiler (a what? they asked). It was what I'd circled for them in a magazine - Turbo Basic. I expected to drop my games in and have them run like crazy. There was a hitch, however. The Tandy 1000SX shipped with an OEM version of Microsoft's GW-BASIC that natively supported the Tandy's enhanced CGA video adapter (essentially the same as the IBM PCjr's with the 16-color 320x200 mode). Turbo Basic didn't. So, as long as the games could be confined to the horrible four-color plain CGA, they'd compile just fine and run pretty fast. I wrote a paint program (Paint!, image below) and a couple of plain (and simple) CGA games (Alien Invasion and BreakOut!, below) in Turbo Basic. In the end, Turbo Basic was still not fast enough and four colors weren't sufficient.
More importantly, there was one specific technical issue that simply could not be solved with any sort of BASIC. I wanted a four-way scrolling world, like those in nearly every commercial game of the time. I was so desperate to make this sort of game that I spent what little money I had on a commercial library (for BASIC) that supported scrolling through assembly-language routines. It turned out that the scrolling supported by the library was simple text scrolling through the BIOS routines, wholly inadequate for what I wanted. I was extremely disappointed and called the (apparently) single-person company for a refund. He refused. There was only one possible solution - to learn C.
C had (and still has) a reputation of being extremely difficult, yet it was the language most commercial games were written in. I bought Mix C - a command line compiler/linker and nothing else - for $20. Coming from BASIC, it was a difficult environment. After floundering with that for months and maybe a year, I finally broke down and bought the Turbo C compiler for $150 and spent the next two weeks trying to build a working piece of software in C. Although I didn't fully understand pointers at the end of the two weeks, I wasn't far off from a basic working knowledge of C. Good tools can make all the difference. I still couldn't figure out scrolling until I received a letter from Herman Miller that brought everything together. From that, Legacy I (shown above and below), written in C, finally came to be. It was a collaboration between Eric Lambert (who did the maps and much of the game design) and me. In truth, it wasn't a game in the end - the player is merely able to roam the world in full 16-color Tandy splendor. That was good enough for me because we now had a new computer with 256 (!) colors. It was time to start all over.
Monday, November 03, 2008
Eventually, there is a vast landscape of low-to-zero cost content. Only the best is ever consumed. Cost of creation is enormous and margins are slim. Related services, merchandising, advertising, product placement, and endorsement are employed to increase profits. Core innovation is low, craft is very high. The technology is mature and is vulnerable to disruptive technologies. More and more is spent against diminishing (though differentiating) improvements in quality.
- Recorded audio becomes a part of movies
- Radio (audio) - a component of television
- Movies and recorded audio - a component of video games
Saturday, September 13, 2008
The Consultant's Law
As directly as possible, ensure that your efforts make money for your clients.
That's the secret to really long contracts and a high level of control over your career. There's a well-known evil doppelganger of this law that I'll call The Law of Job Security.
The Law of Job Security
As directly as possible, ensure that your absence will cost your clients money.
A lot of people seem to live by this law. It's subtly different from The Consultant's Law, but The Law of Job Security breeds resentment and comes across as unprofessional and opportunistic. It's also dangerous, because as irrational as it is in some cases, people tend to punish those who take advantage of them - even when it costs them and their company.
Monday, September 08, 2008
James Devlin clearly put an enormous amount of work into this analysis of currently available evaluators. As part of that work, he finished up the 7-card evaluator I posted here a year and a half ago! He offers it as a library along with a number of other evaluators, including the phenomenal 2+2 evaluator.
And of course, I also really appreciate James' comments :) -
Paul Senzee is well known, in poker-programming circles, for his perfect hash optimization to the original Cactus Kev evaluator and for being an all-around programming BAMF.
Honestly, after that great post and the 2+2 evaluator, I'm not sure if there's any more I could possibly add to the problem of poker hand evaluation.
Although In Other Projects..
Something I've done a little work on recently - At ESPN, Play-by-Play Goes Virtual.
Tuesday, August 05, 2008
Excellent tutorials on Self-Organizing Map theory -
The Self-Organized Gene (Part 1)
The Self-Organized Gene (Part 2)
Intel's Open Computer Vision Library
Recently took a vacation and spent some time in Key West, which was awesome. When we got home I did a little yard work and played around with OpenCV, trivially writing a program to automatically rotate photos by detecting faces and their orientation. Now I'm hooked.
NetFlix renting games has been rumored since its inception. Other services, such as GameFly, have stepped in to try to become the NetFlix of games. But while NetFlix is mainstream, I suspect that GameFly's subscriber base is far from it - and it is an additional charge and an additional service if you already have NetFlix. Unlike movie makers, game makers don't have an equivalent to a 'box office' take. So if NetFlix were ever to decide to rent games, while initially great for the consumer, I believe it would profoundly impact game developers' revenue - which would, in the end, be bad for the consumer given the skyrocketing cost of game development.
So how do we, as game developers, develop a NetFlix-proof business model? Is it all MMOGs from here on out?
Wednesday, July 30, 2008
I don't have a lot of time on my hands, so stuff like this takes quite a long while. Now, there's some messed up stuff going on here. You'll have to forgive me that or wait until it's refined more. For example, scoping isn't right at all. The (let ..) expression isn't how it's supposed to be in a lisp-type language. It defines a variable for the rest of the outer scope, which isn't right. I'll fix that. Continuations aren't currently supported, but the underlying machinery is mostly there, as is the mechanism for running multiple lightweight VM threads, although this is not accessible from the command-line program. Which is probably good, since I'm not sure how well that works with the garbage collector at this point. Also, there's a lot of just general messiness going on. There are a bunch of warnings at compile time and comments are sparse. These things will be fixed eventually. I've compiled these with Visual C++ .NET 2003. There are two parts here.
The Pair Compiler
The Pair VM
See the Code Use Policy if you are interested in using any of this for your own purposes.
Friday, July 18, 2008
The sequel to this phenomenon is the universal second system effect.
Intellectual discovery can be a powerful motivator in design decisions. Take care when tasking a developer with a mundane or boring task. People have a way of making uninteresting problems interesting!
I have more to say about this, including a crazy idea about the SR71 Blackbird, Lisp and the Apollo Missions all somehow appearing before their time (like from the future ;) )..
Wednesday, July 16, 2008
Wednesday, July 09, 2008
The bigger, badder wireless pranking device that evolved from Part I of our pranking bonanza. This is a somewhat technical post. A following post will detail our exploits with it.
The completed device
Learning from the Past
There were a few issues with the first model.
- The photocell was clearly visible on the front of the adapter and that made it look suspicious.
- High maintenance. The contraption didn't work well when it was dark outside. In the interest of simplicity, it was designed to sound when it got too dark - that is, in the normal case, when a shadow fell across it. How dark was too dark was adjustable with a screwdriver. That meant Ryan had to turn it off each night or whenever it rained. It also didn't turn off automatically - it would keep sounding as long as the target was casting a shadow over the photocell. This drove Ryan crazy, since he had to listen to it and switch it on and off all the time.
- It was not wireless. Fazeel was eventually able to follow the cord back to the noise box when he figured it out. Then the jig was up.
- It only emitted one single type of noise. Although deliciously annoying, it didn't offer as much flexibility as we'd like.
- The photocell should be better concealed. Bought a Nortel adapter at Skycraft Surplus for $5 apiece. Using a cordless saw, opened it up, removed the internals. Cut the transformer from the prongs that go into the wall socket, leaving enough plastic around them to keep them solid, and taped the terminals up to insulate them thoroughly from each other and the rest of the electronics I was stuffing in the box. Took one of the detectors from the toy (see below), pulled out the electronics and put them in.
Cut off the antenna and hooked up the adapter's cord as the new antenna. Added a battery holder and two AAA batteries. Drilled out a hole for the sensor, slipped it through. Still pretty conspicuous.
To obscure the photocell, cut clear contact paper into a rounded-rectangle shape consistent with the types of stickers you might find on these things. Then, colored over the contact paper with black dry erase marker.
Because this adapter now has batteries (the adapter from the previous prank didn't since it was powered from the box it was attached to), I had to install some sort of switch to keep the batteries alive. Otherwise, I'd have to tear that thing back open all the time to replace them. I really didn't have a way to solidly mount a tiny switch, so I slipped a tiny reed (magnetic) switch in the seam. It's powered on by sticking a really tiny rare-earth magnet on the switch.
- The device should roughly detect motion, specifically a rapid change in light intensity as opposed to merely detecting dark. This was nailed with the solution to the wireless problem described next.
- Wireless would be ideal. This appeared to be beyond my skills and/or experience in electronics, but I found a toy at Toys 'R' Us that, appropriately modified, was wireless and sensed motion. It's called a SpyGear Wireless Tracker. This also had some issues when used in this way. For example, when triggered, it had a blinking red light (LED) that continued to blink until manually reset. That made my job with the glue circuitry a lot more complex.
- Plays back a recorded message. This appeared to be the easiest to accomplish. Radio Shack, in fact, sells a voice recording/playback module for about $10 (above). This device was almost perfect. The difficult thing about it was that it had a rubber plunger switch to playback with no leads or any apparent way to hook into and trigger it from other circuitry. So I had to take the switch apart, drill tiny holes in the PCB for the wires and solder them directly onto the traces underneath. Not ideal. Sound quality is not great, but then I didn't really expect that much.
The receiver mess (with voice record/playback unit) in nearly completed form.
Getting the glue circuitry working correctly was a pain; in the end I went for an inefficient but dead-simple approach that made all the pieces work together. Maybe in the future I'll refine it. Anyhow, for now, I'm pretty happy with it.
More to come on the fun we've had.. !
Tuesday, July 08, 2008
Video. Fine moments in pranking at EA Tiburon. You'll see narration by my former manager, Rob Hyder, who pulled off perhaps the finest prank ever here - The Disappearing Door.
Our prank may not be as cool as some of those, but we're still refining it. The target? Fazeel Gareeboo, my manager.
Fazeel was out for two weeks, so he was likely expecting some sort of prank. So we had to do something obvious. There's a certain report that Fazeel hates, so of course we papered his cube with it.
2. The Idea
The real idea was to set up an annoying noise that went off whenever the target scooted his chair in to work. Fortunately, Fazeel sits next to the window. So I made a detector that set off a sound whenever a shadow was cast on it. I camouflaged the detector by hollowing out an electrical adapter and used the wire from the adapter to connect with the control circuitry and the buzzer.
Then we taped the buzzer box (yes, made from a Wal-Mart soapdish) to the back of Ryan Burkett's monitor. The box sported a switch so that Ryan could flip it on and off at will.
We got three solid days of torture out of that baby! Yeah!
So, of course, I built a bigger, badder, wireless version with voice playback that I'll post about later.
Monday, June 16, 2008
Created a new blog for stories and entries that don't really belong here. It's called Illustrated Man and can be found at http://senzee-stories.blogspot.com/. Moved La Casa de los Ancianos, Future Headlines and Language Vector over there. If you can't find them here anymore, now you know why.
Did a little pranking at work, as is the custom when a coworker leaves his desk unattended while away on a vacation. Report on that coming shortly.
Other stuff going on too. Can't really go into much detail about it at the moment.
Friday, April 11, 2008
Another, often forgotten strength of C++ and of many traditional modular and modular-turned-OO languages is linking, or more generally, we might say 'package management'. C++ innately offers build-time linking and runtime 'linking' is also usually available (i.e., DLLs). This allows graceful scaling. Tool support for this in C++ is strong and reasonably robust because C++ is so frequently used to build enormous projects. It's possible to build massive projects in pieces in ways that are awkward or impossible in many functional or logic languages.
C++ good at linking and package management??? That's the most ridiculous thing I've heard all week. This is one of the weakest aspects of C++. Where to start:
* no standard ABI for compiled code. ("Don't mix compilers")
* no standard for managing namespaces and packages like in Java.
* Very hard to evolve a class and keep binary compatibility. Lots of obscure compiler-dependent rules need to be followed to do this. ("Change a .h file, rebuild everything that uses it")
* dynamically loading code modules has to be done manually. Java, Python and other languages load classes on demand.
* C++ code can call C code, but any other kind of mixed language development is a nightmare.
* easy to break features like dynamic_cast when loading shared libraries. (My fun debugging adventure this last week...)
I think Simon makes some excellent points here. So I'd like to ask the community.
What is the best language (and what are the best language features) for linking/loading and package and library management?
Thursday, April 10, 2008
One social networking site is better than many.
I have circles of friends in MySpace (a circle of 0), Facebook, LinkedIn, Classmates.com and Reunion.com. The categorization is artificial. I'm friends with people I know who also happen to be members of the respective site. Take Classmates.com for example. My main group is anyone who attended my school the years I did. Thinking honestly back to high school, there are people I liked, people I don't remember and a dozen levels of nuance in between.
Classmates.com and Reunion.com are dead. They are made utterly redundant by Facebook. LinkedIn is a better alternative to exchanging business cards but could easily be swallowed up by a site with a more flexible, natural and/or automatic way of specifying circles of friends and/or privacy policies. MySpace, on the other hand, just sucks.
Until social networks get wise to the psychology of social interaction, they won't be really useful.
I feel uncomfortable mixing my different circles of friends at parties without making sure that they can make some sort of connection. Categorizing relationships is something that we do effortlessly and unconsciously. Determining what to share with whom, when and where is a complex bit of emotional processing that either comes naturally or not at all. For a social network to be really useful, it's going to have to figure this out without us having to explicitly group all our friends. After all, these groups - especially the heavily political circles (I cringe for the oceans of middle and high school students on these sites) - are transient with shifting boundaries. You may say that asking a machine to understand this is asking too much, but I disagree. Given enough information - for example the phone logs in your mobile phone - and the behaviors of everyone else who has ever used a social networking site, existing and proven statistical algorithms such as Amazon's book recommendation could automate the largest part of this (Note that statistical algorithm these days is a politically correct euphemism for AI).
I don't want a site. I want a social widget that I can drop into and enrich any technology I want.
The idea of going to an actual website to share anything with friends is absurd. What's more absurd is having to go to five of them. I need a social network component that I can drop into Madden, my phone, my emailer, my IM application, my videogames, my browser, my newsreader, my MMOGs, my iPod, my television, my casual games.
Would this be a web component like the Google search bar at the top of every site? Under the hood, maybe I'd expect a sort of service or protocol like LDAP (social://facebook.com?paul_senzee) or a central social network database with intelligence. Of course, this implies that..
Whether one or multiple social networking companies prevail, there must be a standard data interchange format for social network information.
Companies that maintain databases of files of this format will learn a lot about who we are and..
We are known by the company we keep.
The individual as a social being implies that certain types of personal information are best and unavoidably contained in a social network. Then why not keep all of it there? Experience has shown that people are willing to forgo certain privacies for much greater benefit and it's easier to hold one company accountable for privacy violations than many. In this way, social networks could allow advertisers and vendors to present me with exactly what I want without having to give my information to 1,000,001 different parties or revealing my identity.
I want a web I'm interested in, and that my friends are interested in. Social networks can be great active filters.
In 1995, the idea of posting a website and speaking to an eagerly listening world was exhilarating. Now it's noise. I want things I'm interested in to come find me, because there's no way I can sort through the enormous mess of the internet efficiently. The social network knows the interests I share with each friend.
I'll find a story or site that interests me and tag it with a button in the browser. That percolates to my friends with similar interests, not to the world at large, and not to my friends that are uninterested in the topic. If I'm playing a 360 game and I get stuck at a spot it would be cool if the game would show me how my friend passed that spot.
We have one global web. How about a billion personal webs filled with information actually relevant and interesting to oneself and one's social circles? A much, much more useful internet. Of course, I'll always want to be able to opt back into anonymity.
We need to publish information with the sensitive parts [deleted]. We want to publish a highly targeted message.
Imagine that we could share information with subtle variation based on the group of people we're sharing it with. If advertisers can do this to us all day long, why can't we all?
A new blog post..
After [show for="EA Employees" alt="our new game"]
For EA Employees it says,
After The Beautiful Princess Tea Party comes out on 09/27/2011, I'm going to take a long vacation in the Keys.
For my family it says,
After our new game comes out, I'm going to take a long vacation in the Keys at the Gardens Hotel.
See, The Beautiful Princess Tea Party is going to be a big surprise for my girls. Finally, for everyone else it says,
After our new game comes out, I'm going to take a long vacation in the Keys.
In this case, blogger.com would query the social network to process this page before it's served up. As you can see, to be really, really effective the social network must be an organized, standardized service at the internet's foundation, instead of a handful of ad-hoc websites thrown up to cash in on the Web 2.0 rush.
That's the web I see coming.
Thursday, April 03, 2008
Wednesday, March 19, 2008
Just so you know, the below stories are fictional..
Student Discovers O(log log N) Prime Factorization Algorithm
COMPUTER SCIENCE NEWS (March 19, 2009) - Brad Halloway, an MIT graduate student, claims to have discovered a prime factorization algorithm that has a guaranteed complexity of O(log log N) where N is the number to be factored (or O(log N) where N is the number of digits in the number to be factored). Researchers are eager to see the algorithm, which Halloway will present in person in November. A number of researchers have provided him with several million-digit numbers, all of which are products of two very large prime numbers. Halloway claims that he was able to factor each of the numbers in about one week. The researchers verified that the factorizations were correct. "If this is true, there will be earth-shattering consequences for computer security .. (more)
MIT Student Found Dead
NBC (March 22, 2009) - An MIT student, Brad Halloway, was found dead in his apartment today, victim of an apparent robbery. The perpetrators carried off a computer and a video game console. This is the latest in a recent string of so-called "Video-Game Burglaries" around the campus. "This is the first where someone has died - he must have surprised the perpetrator," said Capt. .. (more)
Intel Delivers 128 Core Processor
PRESS RELEASE (November 6, 2013) - Today Intel Corporation ships the first of its 3.2GHz 128-core processors manufactured with a new 16 nanometer process pioneered .. (more)
Protein Folding Fully Modeled
Human Brain Functionally Described
SLASHDOT (April 19, 2015) - Scientists today unveiled a functional model of a human brain running on one of the world's most powerful supercomputers .. (more)
Half of World's Population Found on the Net
CNN (January 1, 2016) - It is now possible to find more than fifty-percent of the world's population on the internet. .. (more)
Wireless Cortical Device Demonstrated
MSN (August 12, 2017) - The Cortico device announced this week provides a complete-sensory brain interface to its recipient. Cortical Labs also introduced the surgical equipment necessary to implant the interface. Cortical states that the equipment is self-sufficient and requires no human surgeon to oversee the procedure. .. (more)
Computational Model of White Blood Cell Complete
JOURNAL OF COMPUTATIONAL BIOLOGY (July 12, 2019) - ".. this is a monumental day for computational pharmacology .." (more)
Games Jack In
TIME (June 23, 2020) - A new generation of games is set to hook directly into several leading cortical interfaces for an experience that's "far more reality than virtual," .. (more)
Immortal! Gene Therapy Makes Wealthy, Healthy Immortal
USA TODAY (May 23, 2026) Front Page - It is a move that critics compare to the rich trampling the poor in a rush to leave the Titanic. Yesterday, the FDA approved gene therapy for rendering those (very few) who can afford it - and are in good health - biologically immortal. The subject of intense controversy since deemed technically possible only seven years ago, it has only now .. (more)
Computational Model of Human Body Complete
JOURNAL OF COMPUTATIONAL BIOLOGY (October 3, 2029) - ".. we've obtained the Holy Grail of computational pharmacology .." (more)
New US Childbirth Policy Follows EU
USA TODAY (February 8, 2040) - Following in the footsteps of the EU, the United States today introduced a childbirth policy that requires government permission to bear a child .. (more)
.. and this list is subject to change without notice. ;)
Monday, March 10, 2008
The 90's began a different trend in science-fiction. In one episode of Star Trek: The Next Generation, Picard returned to a modest country home in France that could have come from the 1950's - or the 1850's. Gattaca and Minority Report contrasted simply and starkly adorned futures with warm, familiar early 20th century fashion and interior decor.
The latter examples are a truer glimpse of the future. Nobody wants to live in a world that's metallic, austere, cold, dirty or gaudy. Developers of technology in our world strive for naturalness, unobtrusiveness and basic elegance in design. The less visible, the better.
Consider the ubiquitous computer. It's so pervasive that even talking about 'computers' seems anachronistic. In the 80's people would say about an IT professional, "he works with computers." Now they say, "he's a game developer," "he's a graphics programmer," "he's a web designer," or "he works in information technology." And everyone knows what those things mean. Back in the 60's and 70's, computers were novelties, understood by few. A computer was a sure sign of the future. Today, we spend our lives sitting in front of them. We know them for what they are - tools of immense and perpetually untapped power that also happen to be phenomenal pains in the ass. What is the future we hope for now? It's a future where we can reap the benefits of machines and be free of the frustrations they cause us. So a computer doesn't look like the future anymore, it looks like the frustrating present. And, if you draw too much attention to it, it starts to look like the past.
What does this mean? Technology must disappear. No, it will not go away, but it will become invisible. Take the beautiful iPhone. All those complicated little buttons you find on a smartphone? They just went away. It could only be better if it weren't there at all, while still doing everything the iPhone does. Research proceeds rapidly in flexible electronics, stretchable electronics, wearable computers, transparent electronics, nanotechnology and, most significantly, brain-machine interfaces, so this may come sooner than we think.
Wednesday, March 05, 2008
For now, however, I'd like to look at the Cell. In doing so, I'd like to ask and try to answer the question, "Why has the quality of PlayStation 3 titles lagged behind that of the Xbox 360 titles?"
The Cell Processor is Amazing
From the info I gathered, the PS3's Cell should have a peak numerical performance of about 205 gigaflops. From Wikipedia, I gather that the 360, on the other hand, has peak performance of 116 gigaflops. The PS3's Cell has one regular PowerPC core (with two hardware threads) called the PPU. It also has seven SPUs (one is reserved for the OS), which are streamlined vector processors that are fast as hell, although functionally limited compared to the main PowerPC core. They have their own memory at 256K apiece. Half of that is available to the developer. The 360's Xenon processor, on the other hand, consists of three conventional PowerPC cores almost identical to the Cell's PowerPC core, for a total of six hardware threads. So, if you're like most people, you're probably thinking, "Wow, the PS3 blows the 360 away!"
Except that it doesn't.
Developing Games is Hard
I just read this at .mischief.mayhem.soap about the size of the Assassin's Creed team. Actually, I think that's about the same for Madden NFL. Modern games require an enormous amount of content, some of which must be reflected in code. Games on multicore consoles are much more difficult to debug. Making games is simply much more expensive than it used to be. A colleague remarked that one PlayStation 1 version of Madden had 30 files. Now it's thousands of files, millions of lines and scores of developers.
When we first saw the specs for the Cell, I thought, "it looks like it was designed to play movies." And in playing movies, the PS3 will blow the 360 away. The Cell was designed for performing the same kind of vector calculation on large numbers of small chunks of data. Essentially like a GPU, but more flexible although less parallel. There's a lot of this sort of thing in games. Some physics, audio decoding/encoding, movie decoding/encoding, some graphics/geometry processing. Still, there are a lot of things that aren't well suited for this model. Among those would be language interpretation (we do a lot), gameplay code, sparse data manipulation, and a lot of really mundane and common software engineering things that are inherently sequential. Sorting a list of pointers that must be dereferenced to be compared, for example.
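To make that last example concrete, here's a hypothetical sketch of the kind of pointer-chasing sort that maps poorly onto the SPUs. The GameObject type, its fields and the function names are made up for illustration; the point is that every comparison dereferences pointers into scattered heap memory, a data-dependent access pattern that a cached general-purpose core handles easily but an SPU would have to service with a DMA transfer into its small local store.

```cpp
#include <algorithm>
#include <cassert>
#include <cstddef>
#include <vector>

// Hypothetical object type. The payload stands in for the rest of a
// typical game object, so objects are too big to pack densely into
// an SPU's 256K local store.
struct GameObject {
    int priority;
    char payload[64];
};

// The comparator must dereference both pointers to compare. On a
// general-purpose core the cache absorbs this; on an SPU each
// dereference implies a DMA into local store before the field can
// even be read.
bool byPriority(const GameObject* a, const GameObject* b) {
    return a->priority < b->priority;
}

// Sort a list of pointers, not the objects themselves - the common
// case in gameplay code, and exactly the inherently sequential,
// irregular-access work the post describes.
void sortByPriority(std::vector<GameObject*>& objects) {
    std::sort(objects.begin(), objects.end(), byPriority);
}
```

Nothing here is exotic C++; that's the point. It's bread-and-butter engine code that fits the Xenon's conventional cores naturally and the SPUs' streaming model poorly.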
Explanations of why the 360 performs so well compared to the PS3 usually start out like, "Well, the Cell looks better on paper, but.." Developing for the 360 is like developing for a multicore PC. Take your existing code, multithread it, and know that it'll work on the 360 and on upcoming PC-like multicore machines with few changes.
You can build PS3 software exactly the same way using only the PPU and its two threads, but that will yield almost exactly one-third of the power of the 360's processor. In fact some early titles that were developed first for the 360 did just this. This was not as bad as it sounds though. A lot of early code for these machines was not thoroughly multithreaded because much of that code was directly ported from older Xbox and PS2 code. In these cases, it used just a portion of the 360's power anyway.
Now we're pushing the capabilities of the 360 and trying hard to get the PS3's six fast but special-purpose SPUs to do what we can do with the four remaining general-purpose hardware threads on the 360. It's hard, though certainly possible, to achieve parity. Still, it requires a lot more code and expertise just for the PS3 (usually from our best developers), which typically means much less overall developer productivity.
The Cell Looks Like the Future
Unfortunately, as delightful as it is to develop for, the approach taken with the 360's Xenon processor simply can't scale. At some point, with some number of traditional cores, memory access and synchronization are going to break down or become so contentious that any benefit of a larger number of cores will be nil. So the Cell looks like the future. A small number of traditional cores with a large number of stream processors can yield phenomenal performance in the right proportions. I predict that some member(s) of the next console generation will have a processor with 8 traditional cores and perhaps 256+ stream processors. Does 256 sound ridiculous? Consider that an ATI HD 2900 XT GPU has 320 stream processors.
Why do I think the Cell was premature? It did not fulfill the possibilities of the traditional multicore processor before moving on to a hard-to-develop-for, though potentially more powerful, architecture.
In any event, it clearly foreshadows processor architectures to come. Still, the difficulty of achieving parity between the Cell and 360 Xenon when game development is already at its most taxing really isn't doing the PlayStation 3 any favors.
Tuesday, March 04, 2008
Recursive template engine with escaping and lazy evaluation. Note that it is case SENSITIVE.
1. Add items to Templater's map.
templater.map()["bob"] = "joe";
2. Evaluate a text item that references a map item.
s = templater.eval("well, hello [!bob]");
Returns "well, hello joe" in s.
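For illustration, here's a minimal sketch of what such a Templater might look like in C++. It handles only the recursive, case-sensitive [!key] substitution shown above; the escaping and lazy-evaluation features are omitted, there's no cycle detection, and every implementation detail here is my own assumption rather than the original code.

```cpp
#include <map>
#include <string>

// Minimal sketch of a recursive template engine with a map() / eval()
// interface like the one described above. Lookups are case SENSITIVE
// because std::map compares keys exactly. Values may themselves contain
// [!key] references, which are expanded recursively. No escaping, no
// lazy evaluation, no cycle detection - all assumptions for this sketch.
class Templater {
public:
    std::map<std::string, std::string>& map() { return values_; }

    std::string eval(const std::string& text) const {
        std::string out;
        std::string::size_type pos = 0;
        while (pos < text.size()) {
            std::string::size_type open = text.find("[!", pos);
            if (open == std::string::npos) {
                out += text.substr(pos);   // no more references
                break;
            }
            std::string::size_type close = text.find(']', open + 2);
            if (close == std::string::npos) {
                out += text.substr(pos);   // unterminated reference
                break;
            }
            out += text.substr(pos, open - pos);
            std::string key = text.substr(open + 2, close - open - 2);
            std::map<std::string, std::string>::const_iterator it =
                values_.find(key);
            if (it != values_.end())
                out += eval(it->second);   // recurse into the value
            else
                out += text.substr(open, close - open + 1);  // leave as-is
            pos = close + 1;
        }
        return out;
    }

private:
    std::map<std::string, std::string> values_;
};
```

With this sketch, the two steps above work as described: after `templater.map()["bob"] = "joe";`, calling `templater.eval("well, hello [!bob]")` yields "well, hello joe", and a value like "hello [!bob]" expands recursively.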
I asked him a ton of questions about it, 'cause I'm working on similar ideas. Anyhow, please check out the city generation stuff he's doing. He's working from Muller and Wonka's shape-grammar/L-system generation ideas and producing some awesome work.
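For the curious, the core of an L-system is just parallel string rewriting: every symbol in the current string is replaced, all at once, by its production rule. Here's a toy sketch (the function name and the map-based rule encoding are my own; real Mueller/Wonka-style shape grammars attach geometry and attributes to each symbol, which this deliberately skips):

```cpp
#include <map>
#include <string>

// One L-system derivation: apply the production rules to every symbol
// of the current string in parallel, repeating for the requested number
// of generations. Symbols without a rule are copied through unchanged.
std::string rewrite(const std::string& axiom,
                    const std::map<char, std::string>& rules,
                    int generations) {
    std::string current = axiom;
    for (int g = 0; g < generations; ++g) {
        std::string next;
        for (std::string::size_type i = 0; i < current.size(); ++i) {
            std::map<char, std::string>::const_iterator it =
                rules.find(current[i]);
            next += (it != rules.end()) ? it->second
                                        : std::string(1, current[i]);
        }
        current = next;
    }
    return current;
}
```

With Lindenmayer's classic algae rules (A -> AB, B -> A), three generations take the axiom "A" through "AB" and "ABA" to "ABAAB". City generation interprets the resulting symbols as streets, lots and building operations instead of letters.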
Check out this beauty -
Networking. No. It's hard to schmooze when you're coughing every 12.2 seconds and trying to keep your nose from dripping on your new friend. So I didn't meet a lot of the people I'd hoped to. With those I did meet, I kept my distance and avoided shaking hands so as not to pass on the scourge. Still, it wasn't a total loss. It was the first time I'd been, and it was fascinating. The keynotes were great, including a compelling Ray Kurzweil. I didn't feel well enough to focus on the technical sessions and get much out of them, but there were a lot of ideas in the air. I also attended several game design and art sessions.
The WiiMote and Guitar Hero/Rock Band must have spawned a lot of thought about input devices in the industry. Depth cameras and a bunch of other technologies were on display. We had NeuroSky and Emotiv Systems with their bio-sensor technologies (see the following post on Emotiv Systems).
Physics Engines, UI and Procedural Texture Technologies
There were a lot of middleware vendors, especially around physics and UI. For physics, we had the Octave Engine and others. For UI: Scaleform, Anark, MenusMaster. Scaleform, a Flash-based UI system, seems to be quite popular. I was taken by Allegorithmic's procedural texture technologies. Oh, and motion capture.
Everyone was talking about BioShock. I own it, but I've not yet started playing it; I'm still playing Assassin's Creed right now, which is phenomenal in its own right. I'm excited to get at it. I attended a session with Ken Levine, BioShock's designer.
PC World Video, Recap
I already wrote about Fable 2, though I'll probably amend that (and this) entry as I get more time and reflect a bit more about it.
Monday, February 25, 2008
He announced four new features, which are on Wikipedia already, so for many of the details you'll want to head there. As part of the demo, Peter demonstrated his onscreen counterpart, The Molynator, who, like a solid 10% of the audience, was female. Fable 1 had just one possible (male) protagonist. He announced that The Molynator married (after meeting at a courting spot, or something like that) five years ago. They'd found a home in the slums of Bowerstone early on, then moved to a country residence a few years later. The young couple had a child in that time. Yes, Molyneux said, she experienced all the joys of virtual pregnancy, including the growing belly.
He said he didn't know if the big belly would actually make it into the game. Something about a nine-month-pregnant woman slicing open a dragon or troll with a broadsword strikes me as a little inauthentic. Especially if you've ever seen (or been) a nine-month-pregnant woman trying to get out of a chair. Also, might it be bad form to have a pregnant woman defeated in battle?
Imagine you choose to have a family in Fable 2. Molyneux strongly hinted that your family will become a part of the narrative. He glossed over this one really fast, though. Your children, your spouse kidnapped, murdered? Avenged? Who knows?
Co-op play is inspired by LEGO Star Wars, but much more dynamic. Rebekka and I can play Fable 2 together! So now we won't have to worry about her getting that big belly again. ;) Well, until we finish it. He didn't say anything about the length. Hope it's longer than Fable 1.
There's no overhead map. Instead, there's an innovative breadcrumb trail showing you where you've been most.
Check this out. No money for completing quests? Arghh. You either have to gamble or be hired as a henchman to make money. That just doesn't make any sense to me. I have to get a job to make money? That's way too much like the Sims. Which means way too much like real life. I'm not much of a gambler either. The weird thing is I already have a job. It's the one that I'll use to pay for Fable 2. And I like it more than gambling or being a henchman.
One of the gambling games, Keystone - a roulette-like casino game - will be coming to Xbox Live prior to the release of Fable 2, so that you can make your money before you need to spend it.
I'm excited to play it. I can wait though. I've still got Assassin's Creed and BioShock to work through (not necessarily in that order), so The Molynator's got plenty of time.
More info on the presentation.
Monday, February 11, 2008
That article represents the solitary voice screaming at the tsunami to keep its distance. We humans want to be entertained. We go to great lengths to build entertainment technology and we've always adapted non-entertainment technology to serve the purposes of entertainment.
In any event, I've long held that good-quality games are excellent teaching tools. My eight year-old son is a strong reader. He's always loved games and has (coincidentally) been drawn to games that require heavy reading. Nintendo fans know what I'm talking about. Because of the games, he was motivated early on to read, and the interactivity of games pushed him further in that direction. Now, paradoxically, he loves reading novels (but still plays games!).
Here's another example. My children love Tim Schafer's phenomenal Psychonauts. About a week ago, while getting the kids to sleep, I found my six year-old daughter sitting on her bed deep in thought.
"What are you thinking about?" I asked.
"Daddy, why did Napoleon lose at Waterloo?" She asked.
I laughed, "I don't really know, but we can look it up.."
I don't think her first-grade textbooks could have provoked that question.