Thanks for the interest in our shared video gaming past. I had a lot of fun making that video. The PS1 was a fun machine as it was capable, complex enough that you felt it had secrets, but not so bizarre or byzantine that you felt learning them was a waste of time. And you were pretty much the only one in there as the libraries were just libraries, not really an OS. Still true of the PS2 although that was a complex beast, but by the PS3 there was more of a real OS presence. If you want some more, slightly different, slightly overlapping info on the PS1 or making Crash, I have a mess of articles on my blog on the topic: https://all-things-andy-gavin.com/video-games/making-crash/
Oh wow, this is akin to spotting a celebrity out on the street!
I happened to already be half-way through the extended "war stories" interview on the making of Crash you had done and it is superb; you are a joy to listen to! I remember reading these blog articles of yours many years ago but will definitely be revisiting!
As an aspiring hobbyist game developer, I often feel I have missed out on that golden age of game development, where you really had to think outside the box and hack your way around the architecture to achieve your design and performance goals.
I really enjoyed your "Making Crash Bandicoot" blog posts and the "War Stories" video. I would love to read about your work on Jak & Daxter and working on the PS2.
This was the first American title that was heavily produced by Japan. Before us, Tetris was the only external game that had sold really well. Across all consoles/machines!
Starting out is hard now in console gaming, except "maybe" Xbox Live or something. You'd have to start on iPhone, Android, or the like, where the costs are lower.
Cool, and thanks for the response! Mr. Cerny on here too? Ain't spoken to him in ages :)
Yup, yet I remember when people were complaining how PS2/Xbox games 10x'd game budgets and all hell was breaking loose, with the industry doomed. These days iPhone/Android have technical constraints on par with, if not exceeding, the Xbox/PS2, thus making a highly polished game on a phone a hell of a lot cheaper than a AAA console title, but still not something you can do in your basement.
Guess my question is: what type of game would you start out with? A AAA-level phone game with 80+ hours of gameplay and 10 levels will cost a lot to develop, and no one will play past the first level or pay more than $5 for it. IMHO, of course :)
Well, pause in video games is actually kinda complicated. By the Jak period I had a pause mask (64 bits) that could pause all sorts of independent parts of the game separately: particles, camera movement, texture cycling, light changes, enemies, etc.
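To make the idea concrete, here's a hypothetical sketch of such a pause mask in Ruby. The subsystem names and API are my own illustration, not GOAL's actual code; the point is just that each bit independently gates one part of the simulation.

```ruby
# Hypothetical 64-bit pause mask: each bit gates one independently
# pausable subsystem. Names are illustrative, not from GOAL.
module PauseBits
  PARTICLES     = 1 << 0
  CAMERA        = 1 << 1
  TEXTURE_CYCLE = 1 << 2
  LIGHTS        = 1 << 3
  ENEMIES       = 1 << 4
  PLAYER        = 1 << 5
end

class Game
  attr_reader :pause_mask

  def initialize
    @pause_mask = 0 # nothing paused
  end

  def pause(bits)
    @pause_mask |= bits
  end

  def unpause(bits)
    @pause_mask &= ~bits
  end

  def paused?(bit)
    (@pause_mask & bit) != 0
  end
end

# A "breather" pause: freeze enemies and the player, but leave
# particles, camera, and texture cycling running so the screen
# stays alive.
game = Game.new
game.pause(PauseBits::ENEMIES | PauseBits::PLAYER)
game.paused?(PauseBits::ENEMIES)   # => true
game.paused?(PauseBits::PARTICLES) # => false
```

This is exactly the situation described below: some bits set (characters frozen), others clear (the screen keeps moving).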
Sometimes you just want to freeze the characters in place (and usually enemies and the like) but don't want the whole screen going motionless. For example in Crash, when Aku came out, I wouldn't want to pause the fruit from flying to the score. That would look odd. The Aku pause is to give the player a breather, not just to pause.
Also, the real pause brings up the pause menu, which you certainly don't want with Aku.
Haha. Ken Kutaragi complained about Crash in the very early days that the "trees should wave their branches at you, giving the nostalgia of childhood." Or something like that. At the time we found it merely puzzling.
Natsukashii is one of those Japanese concepts that you just have to learn along with the language, a lot like itadakimasu or the various forms of yoroshiku onegai shimasu. It's not so hard to understand once you understand the feeling they're supposed to convey.
I usually end up mentally un-translating them because the translations end up sounding unnatural in English. Unusual uses of words like "nostalgia" and certain other bits of "translationese" make it easy to spot translations and identify the source language, and not just for Japanese. For example, unnatural-sounding uses of the word "illiberal" usually indicate statements translated from Chinese.
There is such a difference in coding output. Having had perhaps 50 programmers work for me over the years (and being one myself), the top guys do perhaps 10x the output of the merely "very good" guys. And the ratio is near infinite with the mediocre ones, who on tough projects actually suck up more time than they contribute.
The good guys also come in and contribute right off the bat. Like Christophe Balestra, who is now co-president of Naughty Dog. When he arrived on Jak 2 he was pounding out real working stuff the first or second day. By the end of the game (one year later) it was clear he was so kick-ass that we promoted him over like 15 other guys to be co-lead with me on Jak 3. And he continues to kick ass to this day. I just cite him, but I had the pleasure of working with around half a dozen other totally awesome guys too. Still, the "good" guys will take a system and do a great job with it over weeks. The great guys will knock it out in like 24-48 hours.
Lots of articles on this kind of stuff at my site too:
I've found that a difference between the "top" guys and the "very good" guys is that the very good guys are smart enough to do great work, but just aren't as into it. I know many guys who were mathematicians, physicists, or near-professional violinists and fell into programming because they couldn't make money from their other interest. These guys are often really good, but you can tell they wish they were doing something else. That said, the worst dudes are the ones who are really into it, but are really bad.
For the poor performers there may be nothing you can do. But for the merely "very good," are there practices that Balestra can teach them?
For example, one thing I've seen that can increase productivity by a factor of ten is good debugging skills -- which are generally teachable. The other thing is to get people on things they're excited about. Mentally checking out is another way I've seen strong people lose time.
Good debugging is key, and as anyone who ever worked with me will note, I'm a fantastic debugger (in no small part because I'm cold, rational, and rarely get upset). I keep meaning to write up a post for my blog with "Andy's rulez of debugging." They are really very simple, but very effective.
Like: "don't assume" and "divide and conquer" (they do require a bit of explanation)
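For what it's worth, here is a hedged illustration (my sketch, not Andy's rules) of "divide and conquer" applied to fault isolation: repeatedly bisect the input until the failing piece is cornered, instead of reasoning about the whole thing at once.

```ruby
# Illustrative only: binary-search a failing item out of a batch.
# The block stands in for whatever check reproduces the bug on a subset.
def find_culprit(items, &reproduces_bug)
  return items.first if items.size == 1

  half = items.size / 2
  left = items[0...half]
  if reproduces_bug.call(left)
    find_culprit(left, &reproduces_bug)        # bug is in the left half
  else
    find_culprit(items[half..-1], &reproduces_bug) # must be in the right
  end
end

# Suppose processing fails whenever the batch contains a negative number.
data = [3, 7, 2, -5, 9, 4]
find_culprit(data) { |batch| batch.any?(&:negative?) } # => -5
```

The same shape works for bisecting commits, config options, or disabled subsystems: O(log n) reproductions instead of n.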
I'll have to check that out. Although my advice will be free :-) But I'm sure that many, many other good debuggers have developed the same basic techniques independently. Still, the vast majority of programmers could use some improvement in this area. "Quit thinking and look" is exactly what I mean by "don't assume." People tend to get wrapped up in their own view of things and forget that empiricism really wins the day. There is often even fundamental denial, as in "what bug? I haven't seen it." Clearly if someone saw it, unless they were hitting the crack pipe, it's real.
I often find that looking at the code hard and adding the appropriate tests and asserts can be better than immediately pulling out the debugger. Asserts are continually testing your assumptions, whereas in a debugging session you can only see them once.
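As a small sketch of that point (Ruby has no built-in assert, so this defines a minimal one; the damage function is my invented example), the assert both documents and re-checks the assumption on every call, rather than only at the one moment a debugger happens to be attached:

```ruby
# Minimal assert helper; raises when an assumption is violated.
def assert(condition, message = "assertion failed")
  raise message unless condition
end

# The assert continuously tests the assumption that damage is never
# negative -- on every call, not just in one debugging session.
def apply_damage(health, amount)
  assert(amount >= 0, "damage must be non-negative, got #{amount}")
  [health - amount, 0].max
end

apply_damage(10, 3) # => 7
```

When the assumption is eventually violated, the assert fails right at the source of the bad value instead of letting corrupt state propagate.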
Debugging is almost always about uncovering wrong assumptions, but I don't see how you can do without assumptions. Every line of code assumes a certain state of the program that it operates on.
You can't have no assumptions. But half the time I help someone debug something they begin with, "it can't be in this part of the code" which is often unfounded. Now if you PROVE that it isn't, that's a different matter.
I'm not a LISP hater, far from it, I love the language.
In 2006 I wrote a complex multi-threaded socket and web server that talked to MySQL. I wrote it first in ACL, ported it to CMUCL, then to Ruby, and then parts of it (with another programmer) to C (ultra high performance after that). So I got a head-to-head comparison for the same task.
As to libraries in 2006 (and Ruby libs are only better now): it was easy in Ruby to find libs to talk to third-party APIs like Twitter, Facebook, Photobucket, etc. None of this existed in CL. They might be quirky, but they were there.
In ACL/CMUCL, "core" libraries like sockets and database access tended to be missing a lot of "fringe" (not really that fringe) features like transactions, multiple-database support, enums, bigints, etc. But worse than that, they tended to mysteriously hang under moderate volume. I found this true on both ACL and CMUCL, with different libs.
The equivalent Ruby libs, like ActiveRecord, had some SERIOUS quirks, and were missing some of those "fringe" features (I added a lot of them, like multi-db, enums, and bigints). But fundamentally, they were more modern in design and more reliable. ActiveRecord almost never crashed or hung. Yeah, it did some crazy and stupid things, and performance was a problem. But it didn't hang. When writing real production code, mysterious hang/crash/corruption bugs are just a deal killer.
I'm very sympathetic to this exact problem. I love writing code in a Lisp; the language is a beauty. Working in rendering/visualization, there are no real Lisps usable for the performance characteristics that we need, which is a problem in itself, but I do find that I am able to write much of my personal code in more dynamic languages.
For a while I was trying to get practical code written in Scheme, but it then became clear that there were probably about 40 people in the world trying to write actual usable code in Scheme, rather than just using it academically. Eventually I gave up and started using CL because, at least, there are and/or have been people trying to create real products with CL, and there are real sets of libraries.
But again, every once in a while I decide to write a side project in Ruby, for one reason or another: existing libraries, or working with others. And the simple fact of the matter is that on top of Ruby being a reasonable approximation of everything I want from Lisp, macros aside, it simply has a much more mature set of libraries. In my experience not only am I more likely to find a specific library for Ruby, but it is much more likely to be well tested and production-ready.
Sorry you had trouble. I'm glad to say there are more libraries available now than in 2006, and the picture, both in terms of volume and quality, keeps improving.
The GOAL compiler, yes. The GOAL runtime, no. GOAL itself produced high-performance code without arbitrary GC, with simple (semi-static) runtime type checking, etc. It was designed for console runtimes. I didn't write Jak & Daxter in CL, but in this Scheme-dialect language (the compiler itself was written in CL/CLOS).
For nearly two decades I was a diehard LISP advocate. I even forced all my programmers to code three Crash Bandicoot and four Jak & Daxter games in custom LISP dialects that I wrote the compilers for (an article on one here http://all-things-andy-gavin.com/2011/03/12/making-crash-ban...).
But by the mid-2000s I started doing the kind of programming I used to do in LISP in Ruby. It's not that Ruby is a better language; it was mostly the momentum factor and the availability of modern libraries for interfacing with the vast array of services out there. Using the crappy, unreliable, or outdated LISP libraries -- if they worked at all -- was tedious. Plus the LISP implementations were so outmoded. It was very hard to get other programmers (except a couple of enthusiasts) to work that way.
Ruby struck a decent compromise. And its type system and object model are better than CL's anyway. The syntax is more inconsistent, and the macro model nowhere near as good. But it turns out libraries and implementation matter a lot. Still, you can feel lots and lots of LISP influence in all the new runtime-typed languages (Ruby, Python, etc.). And 30 years later, listeners still rule!
Have you investigated Quicklisp[1] using SBCL? I have found Quicklisp generally a better experience than CPAN, pip, Cabal, and the other package managers I've used.
SBCL (a fork of CMUCL) gets monthly releases and recently had a very successful crowdfunding campaign.
Of course, if you find CL itself to be weaker in areas important to you than (say) Ruby, none of the above matters. But I thought you might like to know. :-)