PlayStation 2 games can support HDTVs and EDTVs as well as a 16:9 widescreen mode. Progressive scan mode is generally activated by holding down a pair of buttons (typically Triangle and Cross) after the PlayStation 2 logo appears. When this is done, the game will typically load a screen with instructions on how to enable progressive scan. Many games only offer progressive scan through this method, offering no related option in the game's options menu. Both methods work on a backward-compatible PlayStation 3 as well.
Component video cables are intended for the ED and HD modes, with the color space set to YPbPr in the system settings. While a SCART cable wired for RGB does work, the console switches sync to RGsB (sync-on-green) at 480p and higher; this sync setup differs from the standard VGA RGBHV and will only display on certain compatible monitors.
In the PS2's internal System Configuration menu, the Screen Size option allows for a 4:3 or 16:9 (widescreen) display; however, most games do not poll this option to enable widescreen, relying on in-game options instead. As with progressive mode, widescreen mode can also be forced. Using the products of the PlayStation 2 emulation and homebrew scene, many games that don't directly offer a widescreen mode can be patched to use a true 16:9 aspect ratio. This is achieved by one of two methods: using cheat codes with a cheat engine such as the commercial software Code Breaker or the free homebrew software PS2rd, or by permanently modifying the game executable with a hex editor.
The most important things you can do to improve the clarity, color, and brightness of your PS2 games are to get component cables (if your HDTV will accept them) and to calibrate the HDTV using a THX-certified DVD/Blu-ray.
When you change the aspect ratio of the game to widescreen, your TV does not automatically adjust. You have to manually switch the TV to 16:9 (a.k.a. Full Screen). Don't forget to change it back when you play a game in 4:3 mode.
Some games, like Ratchet and Clank 2: Going Commando, have a widescreen mode that takes the easy way out and simply chops off much of the top and bottom of the video to make it fit the 16:9 aspect ratio. Be sure to check any game you suspect of this. I preferred how the game looked in 4:3 mode, and I still set the TV to Full Screen. There's some distortion, but not enough to bother me.
The main drawbacks to improved video quality via better cables and an HDTV are that jagged edges and graphical artifacts become much more apparent. PS2 games were designed for the brighter, more saturated look of cathode-ray-tube televisions, not the clearer and sharper but less rich LCD screens. Some tweaking of the TV's settings can help here. First, most if not all HDTVs store separate video settings for each input, so changing the brightness on the component input won't ruin the cable TV settings.
Note: TV Type (Standard Scan/Progressive Scan) is determined only with the Guncon 2. If Guncon 2 controllers are connected to both USB connectors, the TV type will be determined by Player 1. If Player 1 is using a Guncon and Player 2 is using a Guncon 2, the TV type will be determined by Player 2. Note that the Guncon 2 only works on SDTVs and EDTVs, but can still enable 480p on HDTVs.
Note: Fake widescreen: the widescreen mode only affects the scaling of the menu and HUD elements; the in-game field of view stays the same (meaning the image gets stretched).
Note: Despite the game being presented in cropped widescreen, it is displayed in a 4:3 aspect ratio. However, by using the monitor's 4:3 zoom function, if available, a fuller widescreen display can be achieved.
Note: HD modes (480p and 1080i) constantly switch to 480i whenever something non-essential is on screen (menus, movies, etc.). HD modes need to be enabled at the in-game options menu every time the game is started, as the setting is not stored in the game save (although the screen ratio option is saved).
The PS2 was a powerful but limited machine. Its speed at rendering to the screen was unsurpassed at the time, but the blending it could do was very limited indeed. Where the Xbox could sample up to four textures and blend them arbitrarily with each other and the current screen contents, the PS2 could only sample a single texture, and either add or alpha blend with the screen. In terms of per-vertex calculations, the PlayStation 2 had a general purpose processor (called VU1) to process vertices, compared to the Xbox’s vertex shader which had a limited instruction set.
As mentioned in the introduction, we were able to leverage Argonaut’s existing PlayStation 2 technology, notably the code that ran Lego Bionicle and I-Ninja. The Xbox clipped geometry in hardware, whereas the PS2 had to use software clipping, and would crash horribly if it got it wrong and let it draw too far off the sides of the screen.
The PS2 had an unusual feature — it supported two separate hardware rendering contexts. A context held settings like texture modes, blending modes, render target and so on. At render time, a triangle could be submitted through either render context. Okre used this to perform lighting and texturing at the same time for dynamic objects: one context had the settings for the texture of an object, the other had the lighting settings. The lighting context was set to add to a different buffer, so by the end of the texturing and lighting pass we would have two screen-size buffers: one with the unlit, textured scene, and the other with the lighting.
Before rendering we initialised the lighting buffer to the ambient colour. Then we added on any self illuminated surfaces, as we did on the Xbox. The PS2 hardware didn’t have all the clever pixel-counting and early outs for Z failures, so we had no “Z stamp” pass.
The tricky bit then came as we needed to combine the textured scene with the light buffer by multiplying the two together. Unfortunately the PS2 doesn’t have a multiplicative blending mode: the best it can do is multiply textures by a constant value and by either alpha or (1-alpha). Even worse, its idea of “multiplying” two values together is actually (x * y) >> 7 — that is, the result is double the value you’d expect. (Multiplying two 8-bit values together gives a 16-bit answer, and usually you’d shift this back down by 8 to get an 8-bit answer out.) This “feature” allows you to brighten things by up to 2x during lighting (for mock specular effects), but actually throws away one bit of accuracy with each multiply. Aargh!
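To make the arithmetic concrete, here is a tiny stand-alone C++ sketch of that blend maths (just an illustration of the >>7 behaviour, not actual GS code): a conventional 8-bit multiply shifts the 16-bit product down by 8, while the PS2-style multiply shifts by 7, so 128 (0x80) behaves as “1.0” and values above it brighten the result.

```cpp
#include <cstdint>
#include <cstdio>
#include <algorithm>

// Conventional 8-bit multiply: shift the 16-bit product down by 8.
static uint8_t mul_shift8(uint8_t x, uint8_t y) {
    return static_cast<uint8_t>((x * y) >> 8);
}

// PS2-style multiply: shift by 7 and clamp, so the result is doubled relative
// to the conventional one. 128 (0x80) acts as "1.0"; 255 roughly doubles x.
static uint8_t mul_shift7(uint8_t x, uint8_t y) {
    return static_cast<uint8_t>(std::min((x * y) >> 7, 255));
}

int main() {
    std::printf("conventional 100*255 -> %d\n", mul_shift8(100, 255)); // ~100
    std::printf("PS2-style    100*128 -> %d\n", mul_shift7(100, 128)); // 100 ("1.0")
    std::printf("PS2-style    100*255 -> %d\n", mul_shift7(100, 255)); // ~199 (about 2x)
    return 0;
}
```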
The trick is to “lie” to the hardware about the type of texture you have. By pointing the texture unit at the 32bpp light buffer but saying “hey, it’s an 8bpp palettized texture”, it was possible to read just one component out (i.e. either red, green or blue). Setting an appropriate palette (1 in either red, green or blue as appropriate, and index/2 in alpha) would then look up the 8-bit value and give a neutral colour with an alpha value of the “actual” amount. This would be used by the hardware as the value to multiply with, and so you’d get a single-component multiplication. Repeat this for green and blue and you’ve done a full-screen multiply.
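As a rough illustration of that palette, here is a C++ sketch that fills a 256-entry CLUT for one channel, under the assumption (standard on the GS) that 0x80 represents 1.0 for colour and alpha; the names and layout are mine, not the engine’s.

```cpp
#include <array>
#include <cstdint>

// One palette entry: a neutral "1.0" in the selected colour channel and the
// index, halved, in alpha, so the looked-up alpha is the amount to multiply by.
struct ClutEntry { uint8_t r, g, b, a; };

std::array<ClutEntry, 256> make_channel_clut(int channel /* 0 = red, 1 = green, 2 = blue */) {
    std::array<ClutEntry, 256> clut{};
    for (int i = 0; i < 256; ++i) {
        ClutEntry e{0, 0, 0, static_cast<uint8_t>(i / 2)};   // alpha = index/2 (0x80 == 1.0)
        if (channel == 0)      e.r = 0x80;  // "1" in the channel being extracted
        else if (channel == 1) e.g = 0x80;
        else                   e.b = 0x80;
        clut[i] = e;
    }
    return clut;   // uploaded as the palette when the light buffer is read back as 8bpp
}
```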
This was made even more complex by the fact that the layout of the buffers was neither a contiguous ARGB nor a simple planar format. Instead, when interpreting a 32bpp screen as if it were an 8bpp texture, you saw a 16x4 block of pixels for each 8x2 source block.
Selecting a single component from the buffer (when the buffer is viewed as an 8bpp texture) means crafting thousands of quadrilaterals that cherry-pick the right part for each component. We took advantage of the fact that the pattern repeated regularly, and so could prepare a set of rendering instructions offline for a single screen block and reuse that over and over again, just moving the source and destination texture pointers as appropriate. All of this could happen on the VU1 chip, leaving the data bus free for the CPU while this operation occurred.
Apart from the pre-calculated scenery shadows, we couldn’t come up with a viable way to do true dynamic shadows on the PS2. There’s no stencil buffer on the PS2, and no way to sample the depth buffer as a texture, and although there are some tricks to achieve both, they’re very expensive. We were already burning so much processing power on our lighting that we decided against implementing them.
Instead we went with the more game-y black circle shadow drawn under the characters. Using the collision system, we’d find out where the floor was under the characters and just plonk a shadow circle there. For shadows cast onto the dynamic objects, we did a ray trace (also using the collision system) of several rays from the objects to their nearby lights. The result of this was an approximation of how much in shadow an object was, and we darkened the whole object accordingly.
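A minimal sketch of that approximation, with the collision query passed in as a callback since the real one lived in the game’s collision system (the names here are hypothetical):

```cpp
#include <functional>
#include <vector>

struct Vec3 { float x, y, z; };

// Fraction of rays from the object to its nearby lights that are blocked:
// 0.0 means fully lit, 1.0 means fully in shadow. The whole object is then
// darkened by this amount.
float shadow_darkening(const Vec3& object_pos,
                       const std::vector<Vec3>& nearby_lights,
                       const std::function<bool(const Vec3&, const Vec3&)>& ray_blocked) {
    if (nearby_lights.empty()) return 0.0f;
    int blocked = 0;
    for (const Vec3& light : nearby_lights)
        if (ray_blocked(object_pos, light)) ++blocked;
    return static_cast<float>(blocked) / static_cast<float>(nearby_lights.size());
}
```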
In order to get the camera effect (virtual aperture and exposure), we used a similar technique to the multiply trick to look up the red, green and blue component of the screen in an exposure table.
The colour bleed effect was one of the easier things to port over: the PS2 supported subtractive blending, so we could threshold the brightest pixels. Then downsampling, re-upsampling and adding was just texture manipulation and additive blending.
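The idea is simple enough to sketch on the CPU (the real thing used the GS’s subtractive and additive blend modes plus render-to-texture, not a loop like this): subtract a threshold with saturation to keep only the brightest pixels, blur them, and add the result back.

```cpp
#include <algorithm>
#include <cstdint>
#include <vector>

static uint8_t sat_sub(int v, int t) { return static_cast<uint8_t>(std::max(v - t, 0)); }
static uint8_t sat_add(int a, int b) { return static_cast<uint8_t>(std::min(a + b, 255)); }

// Greyscale for brevity; a 'w' x 'h' image stored row by row.
void colour_bleed(std::vector<uint8_t>& pixels, int w, int h, uint8_t threshold) {
    // 1. Bright pass: saturated subtraction keeps only the brightest pixels.
    std::vector<uint8_t> bright(pixels.size());
    for (size_t i = 0; i < pixels.size(); ++i) bright[i] = sat_sub(pixels[i], threshold);

    // 2. Crude blur standing in for the downsample/re-upsample step.
    std::vector<uint8_t> blurred(pixels.size());
    for (int y = 0; y < h; ++y)
        for (int x = 0; x < w; ++x) {
            int x2 = std::min(x + 1, w - 1), y2 = std::min(y + 1, h - 1);
            blurred[y * w + x] = static_cast<uint8_t>((bright[y * w + x] + bright[y * w + x2] +
                                                       bright[y2 * w + x] + bright[y2 * w + x2]) / 4);
        }

    // 3. Additive blend the blurred bright areas back onto the image.
    for (size_t i = 0; i < pixels.size(); ++i) pixels[i] = sat_add(pixels[i], blurred[i]);
}
```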
Sampling the brightness of the screen to feedback to the camera effect involved downsampling the screen until it was very small (around 32x32), and then transferring this small texture back from the graphics unit to the main memory where the CPU could read it.
Invaluable in the development process was the PlayStation 2 “Performance Analyser”, which was a gigantic, instrumented PS2 development kit. As far as I can tell it was a normal PS2 with a bunch of very fast RAM and hardware logic probes on all the bus signals. Triggered by a foot switch it would capture all the main signals going on for a whole frame. An application running on the developer’s PC would then let you visualise the captured data. This was handy for finding and fixing bottlenecks, and also at a push for working out why geometry wasn’t turning up in the right place. It required a pretty comprehensive understanding of how the PS2 worked though.
A Performance Analyser scan. It's the only one I could find: it's not actually of SWAT, but of Lego Bionicle's "Depth of Haze" effect. From left to right it shows the graphics commands executed; the location of the last branch the CPU executed; time series of the various control signals; and a key to the graph. The CPU is actually in a busy spin waiting for the GPU (note the branch panel shows the same locations over and over, and the CPU graphs (green at the top) indicate 100% CPU usage - only really achievable in a tight loop with no external dependencies). The rest of the graph is pretty incomprehensible without going into another chapter's worth of post here.
The Xbox had an equivalent system, except instead of being a very expensive piece of hardware, it used software instrumentation to capture essentially the same data, although not at the level of detail of the PS2’s. The Xbox’s equivalent went on to become DirectX’s PIX, although it was much more powerful owing to the tool knowing exactly what graphics card it was running on. This meant it could do things like display the captured frame and then allow you to click on a pixel somewhere and tell you exactly how that pixel came to be rendered the colour it ended up as — which polygons drew over it, what blending modes and textures were used, and so on.
Staggeringly, the PS2 didn’t interpolate its colours taking perspective into account. This meant we had to be careful to minimise the colour change over larger polygons, particularly if they were next to smaller polygons, where it would be more noticeable that the colour interpolation was completely different in each case.
I can’t remember how we dealt with normals. I think we just factored them into the colour, but due to the lack of perspective-correct colour interpolation, something tells me we may have actually ignored normals.
Old TVs in the era of the PS2 and earlier consoles were mostly CRTs. LCD screens became standard around the time the PS3 came out, most of the way through the PS2's lifecycle. CRTs don't have discrete pixels and tend to blur a bit, so images don't look so sharp and you can't see the defects. That meant the PS2 could get away without much antialiasing, because the blur of the TV would do it somewhat. CRTs are also well suited to variable resolutions, since there are no discrete pixels.
LCD screens have discrete pixels, which means they only look good at their native resolution. If you have a 1080p TV, only a 1920x1080 image is going to look right; everything else gets scaled to fit, which may result in pixel doubling, or other distortion if the resolution doesn't divide evenly into 1080 (480 lines into 1080 is a non-integer 2.25x, so lines get duplicated unevenly). With discrete pixels the image also tends to be a lot clearer, depending on the inputs used to deliver the signal, which shows more flaws in the image. Upscaling the interlaced 640x480 image the PS2 tends to put out will expose flaws in the image.
Also, the standard connection now is component or HDMI, and component provides the cleanest signal the PS2 can output. Composite was standard when the PS2 was new, and it tended to bleed colors like red, since all the video data was carried on a single wire instead of three, and was analog instead of digital, so it could degrade in more increments. Digital tends to be a good signal or no signal, but analog allows for fuzz in the image and color distortion.
Really, the best option for the PS2 was a CRT with component input and progressive scan support, but those are a rare beast now, a decade into the LCD HDTV era. CRT tubes are now categorized as hazardous waste in many areas due to being made of leaded glass, so thrift shops often won't take them as donations and pawn shops won't buy them. They're also very heavy, and thus expensive to ship if you can't find one locally, limiting your purchasing area.
If you"re going to use an LCD TV, you want to not stretch the image, so you have the black bars on the left and right sides, and you want to use component cables to get the best image you are going to get. Otherwise, you want to use a PS2 emulator via PC so it can natively output at your TV"s actual resolution, but compatibility for those is still lacking in many games, and the hardware requirements are steep. Even a $1000 gaming desktop is going to not play some games at a good frame rate.
Some HDTVs have the option not to stretch the image horizontally, to preserve the 4:3 aspect ratio. Some can also avoid zooming the image, but that means you will have a small box of content in the middle of the screen. Upscaling while keeping the aspect ratio usually isn't too bad, but breaking the aspect ratio will distort the image.
LCD screens have discrete pixels, while CRTs do not, so games with poor antialiasing (or none at all) will look more jagged on an LCD screen. Almost all HDTVs are LCD, but almost all SDTVs are CRT.
If you dug up a CRT computer monitor that supported higher resolutions, it wouldn't really affect how PS2 games look (if you had the official VGA cable or a VGA box), because the console would set the monitor to 640x480; it's not the resolution difference, it's the screen technology that makes the difference. LCD is usually poor at displaying content that isn't its native resolution. CRT is great at any resolution it can support because there are no individual pixels, just a phosphor array and a shadow mask.
Moving house is a great time for reflection. You’ll end up handling things that mean a lot to you, even though you haven’t seen those things for years. And while you may only see those artefacts every Olympiad or so, the reverence with which you touch them, and the smile they bring to your face will mean they will get placed in yet another box, labeled, and transported carefully to your next residence. Once there you’ll find another corner of another dark cupboard where these treasures will be safe.
The PSone and PS2 games pictured above are for me such treasures, and even though I no longer even own a PSone or a PS2, I can’t bring myself to get rid of them.
Believe me, I’m not a hoarder. I’m not even particularly sentimental. I don’t have shelves full of figurines or collector’s edition box sets - not that there’s anything wrong with that - I’m just not that way inclined. Many of my previous consoles’ games have long since been traded in or become landfill, but the ones pictured above are more like photo albums. Each, for various reasons, carries specific memories with details that go beyond my experiencing them as simple games that I’ve finished.
I remember in DRIV3R, the first time I ran my car off a bridge and into the water, sitting for minutes thinking that what I was staring at was a “Game Over” screen. After a while I jiggled the left thumb stick and the black blob floating in the water moved accordingly. What? I jiggled again, the blob moved again. Trust me, there was a time when avatars did not know how to swim, and discovering it for myself first hand was nothing short of a revelation.
It goes without saying that I can’t throw out my boxed copy of Final Fantasy VII (or VIII, IX, X, X-2 or XII). The television I played it on in a preceding apartment was a hand-me-down from my parents, and it was big. Not big in screen size, just big in general - that thing was a piece of furniture. Housed in a timber cabinet it had a great flat top on which my pipe collection proudly sat in its wooden pipe holder, which was of course carved in the shape of a pipe. My couch consisted of huge big pillows lined up against the wall facing the TV, and I was Cloud for dozens of hours between the Autumn of ’98 through to late Spring.
Playing Buzz!: The Music Quiz with my then girlfriend (now my wife) with its crazy little game show buzzer peripherals - she’d kick my arse every time, but in my defence, eighties pop music (which the game seemed to major in) was never really my bag either.
So you get the idea. This box of games for which I no longer have the appropriate consoles on which to play them will travel with me like a photo album. I have no interest in replacing my long since departed PSone or PS2, which I could easily do. I’m too busy finding enough time to keep up with modern classics such as The Witcher 3: Wild Hunt. I know that one day I’ll dig out CD Projekt Red’s current masterpiece, unfold the paper map inside the vanilla edition’s box, and think about the apartment I played it in, and the times that I had there.
I tried to get used to the very blurred and ugly graphics of the PSX/PS2 plugged into my big-screen TV, but it deprives me of any enjoyment of playing games.
So I"m wondering which type of TV or monitor special dedicated for this old Sony"s consoles. Generally, the most interesting option for me is an LCD monitor/TV with about 20 inches diagonal screen, but I am not sure if an LCD monitor/TV will behave differently than a big TV OLED, which kills all the fun in the game.
If some of you can confirm that LCD variants can handle this video in acceptable quality, the next question is about the preferred inputs available on such devices. Most LCD monitors of that size have just VGA and DVI inputs, so I am a little worried about quality loss due to converters from VGA to SCART/RCA. However, this converter problem does not exist in the case of an LCD TV or an LCD TV/monitor hybrid, because, as I understand it, they have these inputs.
Although Sony’s London-based GTA clone was nowhere near as good as we’d hoped – plagued with clunky controls, dodgy design choices, and an awful camera when on foot – there was still something about it that drew you in.
Perhaps it was the meticulously recreated map of London, the Snatch-style adult dialogue, or the photo-realistic visuals. The Getaway was an entertaining, if often frustrating experience (remember that hair-pulling laser security bit? Oh, good lord). It also had two separate stories, with the completion of Mark Hammond’s campaign opening up Flying Squad detective Frank Carter’s series of events, depicting the other side of the thin blue line.
Torque is sent to Abbot State Penitentiary, which soon gets hit by an earthquake, unleashing all sorts of hellish creatures, which Torque has to deal with.
Have you played Gears of War, Mass Effect, Uncharted, or any one of the myriad cover-based shooters that saturate the market? You probably have, but without Kill Switch, you may not have had the chance.
It was a fairly bare bones, budget game, with minimal polish, but it played very well, and the cover system made it stand out, giving combat a big enough twist, and an enjoyable one at that. It’s worth playing simply to see where the genre as we know it today came from.
This is one of the biggest gaming mascot-type characters to fail to make it as big as it should have. The Sly series is a great cartoon stealth platformer, which has now been re-released on PS3 in HD form (the original trilogy). Initially developed by Infamous developer Sucker Punch, the game is a cult classic and successfully merged 3D platforming with stealth elements.
In this game, which put you up against all sorts of deadly ghosts and spirits, you didn’t play as a soldier with guns, a police officer, or even an adult with a stick. You played as a young school girl armed only with the Camera Obscura. This was a magical camera that could exorcise spirits, and it was your only defense against the supernatural.
Project Eden was a brilliant puzzler in the TR mold, only this time you had four different characters to control, each with their own unique skills. Team leader Carter could interrogate people and access high security doors, engineer Andre could repair machinery, Minoko was the hacker of the team, and Amber was a powerful cyborg, capable of surviving hostile environments.
The film’s focus on fear and mistrust was also used in the game to great effect, and characters could become infected, meaning Blake would have to find and enlist the services of other survivors. Eventually, Blake discovered the truth, and after battling an army of alien beasts using guns, flame throwers, and other methods, he located the alien ship and did battle with the big bad Thing itself, with the help of none other than MacReady.
Shadow of Rome was a game of two halves. Agrippa’s sections were all about brutal combat and action, while Octavianus’ sections involved stealth and puzzle solving, and the two disparate styles worked well together, breaking up the violence (which was pretty graphic) with some slower-paced stealthy sections that also gave you the chance to explore famous areas of Rome.
Arriving on the market several years before Guitar Hero and Rock Band, Gitaroo Man was a precursor of what was to come. It didn’t feature the exact same style of play as GH and RB, instead using onscreen controller prompts when in guard mode, but it did feature a unique guitar-playing interface when the player had to strum to the music. Using the analog stick to follow the “trace line,” you had to keep the aiming cone on the line while pressing buttons to play music and “attack” your foe. The modes alternated as the song progressed, meaning players had to quickly change from attack to guard, and so on.
You infiltrated enemy bases, sabotaged supplies, and generally became a major thorn in the side of the invading army. All of this took place within an occupied New York. As you succeeded in your goals, you gained charisma. The more charisma you had, the more followers you could lead. You could tell these allies to follow, defend, and attack, which was simple squad commanding, but functional. Each chapter was made up of various missions, and your actions in one mission could affect events in another, with some actions weakening the Russian military presence in later missions.
You’ve heard of the Batman: Arkham series, right? Of course you have. We’re willing to bet you’ve not heard of Urban Chaos: Riot Response, though. This is the debut game from Arkham creator Rocksteady Studios, and it’s one of the best and most highly polished FPS titles on the PS2.
You played the role of Nick Mason, an officer in the ‘T-Zero’ riot response division of the police. Armed with your trusty riot shield, and a host of other weapons, your job was to take down criminals and gang members, often having to find and subdue a gang leader with a non-lethal attack, at the same time rescuing hostages.
Urban Chaos looked great for a PS2 FPS, and it featured some of the most satisfying gunplay around. Head shots in particular were gratifying (and often the best way to take out foes, so mastering them was important), and the riot shield opened up new game mechanics, such as having to slowly approach a hostage-holding gang member, shielding yourself from fire until you could get in that elusive headshot. Brilliant.
Based on the manga Dororo, Blood Will Tell was a great game that features one of the craziest story premises we’ve seen. You’re Hyakkimaru, a man whose major organs and body parts were all stolen by demons at birth after his father, the land’s ruler at the time, made a deal with them in order to bring peace back to the land. Hyakkimaru was then abandoned by his father and found by a man named Jyukai, who created artificial body parts and prosthetics to rebuild Hyakkimaru’s body. Eventually, Hyakkimaru heard a heavenly voice tell him that if he slew the fiends that took his body parts, he could regain them, and his humanity.
Armed with a deadly katana and twin blades concealed in his arms, as well as an arm-mounted machine gun and a leg-mounted bazooka, Hyakkimaru set out to find and defeat the 48 fiends, accompanied by his companion, the young thief, Dororo.
Blood Will Tell played very much like Devil May Cry, only with larger, more open areas and some stealth and puzzle sections (as Dororo). Hyakkimaru and his implanted weapons made for a great combat character, with all sorts of crazy moves and combos, which could be upgraded as you progressed. The levels were varied, and there was no cheating or shortcuts taken. You actually did seek out and kill 48 fiends, many of which were impressive bosses, and some were downright freaky. Each chapter of the game had its own mini-story, keeping things interesting. This was a brilliant fighter that you really should dig out.
If Disney and Pixar weren’t so against violence, The Mark of Kri is possibly what we’d end up with. Behind the very Pixar-like aesthetics lies a violent but well-crafted stealth adventure.
Rau Utu is a powerful warrior who, helped by a bird called Kuzo, accepts a mission to investigate some local bandits and is drawn into a bigger quest, with major repercussions.
The Mark of Kri was primarily a stealth game, requiring careful use of Rau’s scout, Kuzo, and stealth tactics to take enemies out silently. The unique control system used both analog sticks: the left for movement and the right to sweep around the area with an aiming line, used to attack nearby foes. Rau also got a bow and special abilities, all of which were used tactically to achieve his objectives.
A music shooter, Rez is a trip for the eyes and the ears. It’s an on-rails shooter that ties the onscreen action and your success to the music. As you fight, you add music and sound effects to the soundtrack, and your onscreen avatar transforms. Everything in the game reacts to the beat of the music, and the Panzer Dragoon-style controls and impressive bosses all make for a short, but unforgettable shooting experience.
It’s crazy to think that a series as popular as Monster Hunter was once overlooked by most. The original Monster Hunter arrived on the PS2, and was promptly dismissed by all but those who had the time and patience to give it a real chance.
The hunting of the original game was accompanied by a complex gathering and crafting system, with every item farmed or carved off fallen beasts being used to make items, weapons, and armor. The game, thanks to numerous quests, many of which you needed to grind in order to find rare resources, is immense. It tried its best to make you dislike it with clunky controls and a dodgy camera, but this was one title where it was well worth persevering, just like the many sequels.
Clover Studios was one of Capcom’s most promising divisions before it was closed down. It was responsible for two of the best underrated games on the PS2, one of which was God Hand (see the next entry for the other).
You played as Gene, a fighter who lost his arm in a gang attack. Luckily, he was bestowed with a replacement, one of the two God Hands, magical arms used to combat demons. With this arm now a part of him, Gene walked the Western-themed world fighting all sorts of bonkers villains and demons with a range of over-the-top combat moves.
If you’re one of the people who stuck with it (not many, apparently, hence its commercial failure and inclusion here), you found a great, challenging beat ’em up with style, personality, and some truly enjoyable gameplay. The game’s quality isn’t all that surprising, as Resident Evil designer Shinji Mikami directed it.
It’s been called the PlayStation 2’s Zelda, but Okami is far more than a simple clone, and it’s undoubtedly one of the best games ever released on the console. Based on Japanese mythology, with a brilliant ink and paper art style, you played the role of Amaterasu, a goddess in the form of a white wolf with the ability to use the “celestial brush” to manipulate the world and create objects.
By drawing on the screen, you could create bombs and gusts of wind, make trees grow, and do many other things, all with the aim of restoring life to the land, which was ravaged by the demon Orochi.
Okami took masses of inspiration from Zelda, and played in a very similar manner, with a large, open world, dungeons, boss fights, and skills and items required to access various, otherwise sealed off areas. This was all delivered in a truly charming and beautiful manner, and it played brilliantly.
Okami was an epic and flawless adventure, and if there were any issues to be found, it was the lack of real difficulty. Still, with a long and varied story with tons of side quests, memorable characters, and all sorts of extras and mini games, Okami is unmissable, which makes it all the more upsetting that it was overlooked by most, contributing to the death of a very talented studio. Damn.
Yes, it had to be here. Ico is usually the first game anyone thinks of when asked about underappreciated PS2 games, and for good reason – it was both overlooked and bloody brilliant.
It’s a very challenging and often emotional journey of a game. It went through a period of being very rare, commanding high prices on eBay, but now it can be found in an HD double pack with Shadow of the Colossus, so is far easier to find, which is something you really should do.
Created by Michel Ancel, the game told the story of Jade, a photojournalist who looks after a group of orphans with her uncle, a humanoid pig called Pey’j. The world she lives in is called Hillys, and it’s invaded by an alien race known as the Domz. This race kidnaps Hillyans to use as energy sources or slaves.
Jade embarks on a mission to uncover the Hillyan military’s involvement with the alien threat, ultimately to stop the invaders and free the planet. She does this by infiltrating various facilities in order to acquire photographic evidence of the collaboration. Using a combination of stealth and combat with her staff to explore the world, Jade acquires various upgrades, for her and her vehicles, including her hovercraft. She is also accompanied by Pey’j and special operative Double H.
PlayStation 2 models were produced from 2000 to 2013. Some PlayStation 2 (PS2) revisions only change in their internal construction while others feature substantial external changes. Each region receives a different model number; for example, the V18 was released in North America as SCPH-90001, in Australia as SCPH-90002, and in Hong Kong as SCPH-90006. The final digit is a region code with no bearing on the hardware; many games and DVDs are restricted to certain regions, and the system software displays in different languages.
The PS2 is primarily differentiated between models with the original "fat" case design and "slimline" models introduced at the end of 2004. In 2010, a television incorporating a PS2 was introduced.
Three of the original PS2 launch models (SCPH-10000, SCPH-15000, and SCPH-18000) were only sold in Japan and lacked the expansion bay of later PS2 models. These models instead included a PCMCIA slot. SCPH-10000 and SCPH-15000 did not have built-in DVD movie playback and instead relied on encrypted playback software that was copied to a memory card from an included CD-ROM (normally, the PS2 will only execute encrypted software from its memory card; see PS2 Independence Exploit). V3 had a substantially different internal structure from the subsequent revisions, featuring several interconnected printed circuit boards. In V4, everything except the power supply was unified onto one board. V5 introduced minor internal changes, and the only difference between V6 (sometimes called V5.1) and V5 is the orientation of the Power/Reset switch board connector, which was reversed to prevent the use of no-solder modchips. V5 also introduced a more reliable laser than the ones used in previous models. V7 and V8 included only minor revisions to V6.
Beginning with model SCPH-500xx (V9 and V10), the i.LINK port was removed. An infrared receiver was added for use with a remote to control DVD playback, so the remote no longer needed an external receiver plugged into a controller port.
The PS2's standard color is matte black. Several different color variations were produced in different quantities and regions, including ceramic white, light yellow, metallic blue (aqua), metallic silver, navy (star blue), opaque blue (astral blue), opaque black (midnight black), pearl white, Sakura purple, satin gold, satin silver, snow white, super red, transparent blue (ocean blue), and a limited edition pink, which was distributed in regions including Oceania and parts of Asia.
In September 2004, Sony unveiled its third major hardware revision (V12, model number SCPH-700xx). Available in late October 2004, it is smaller, thinner, and quieter than the older versions and includes a built-in Ethernet port (in some markets it also has an integrated modem). Due to its thinner profile, it does not contain the 3.5" expansion bay and therefore does not support the internal hard disk drive. It also lacks an internal power supply, similar to the GameCube, and has a modified Multitap expansion. The removal of the expansion bay results in incompatibility with the few games that require the HDD expansion, such as Final Fantasy XI.
There are two sub-versions of the SCPH-700xx: one with separate Emotion Engine (EE) and Graphics Synthesizer (GS) chips, and the other with the newer unified EE+GS chip; they are otherwise identical. The sub-versions are variously referred to as V12 for both models, as V11.5 for the older and V12 for the newer model, or as V12 for the older and V13 for the newer model. The console was also released in additional colors in several regions, including the GCC countries, France, Italy, South Africa, and North America. A limited edition pink console also became available after March 2007.
V12 (or V13) was succeeded by V14 (SCPH-7500x), which contains different ASICs from previous revisions, with some chips having a copyright date of 2005, compared to 2000 or 2001 for earlier models. It also has a different lens and some compatibility issues with a number of PlayStation games and even some PS2 games. A product called HDPro was created, but had limited success.
Sony also manufactured a consumer device called the PSX that can be used as a digital video recorder and DVD burner in addition to playing PS2 games. The device was released in Japan on December 13, 2003, and was the first Sony product to include the XrossMediaBar interface. It did not sell well in Japan, and was not released anywhere else.
Released in Europe in 2010, the Sony BRAVIA KDL22PX300 is a 22-inch (56 cm) 720p television that incorporates a PlayStation 2 console (with a disc cover that is opened by pressing the "Open" button and closed by sliding it shut) and four HDMI ports. The TV also includes BRAVIA Internet Video Access, allowing users access to streaming services such as YouTube and on-demand television. 1080p input sources can be used and displayed.
Selected games on Sony"s video game console offer online capabilities. Games that enable the feature provide free online play through the use of a broadband internet connection and a PlayStation 2 Network Adaptor. Since the service has no official name, it is sometimes referred as either PS2 Network Play, PS2 Network Gaming or PS2 Online.
The service was launched in July 2001 in Japan and in August 2002 in North America. On the later "slimline" models, a network adapter is integrated into the hardware. Some games also allowed online gameplay using a dial-up connection (not available on all models), or LAN play by connecting two network adapters/slimline consoles together directly with an Ethernet cable or through the same router network.
Instead of having a unified online service like SegaNet or Xbox Live, online multiplayer on the PS2 was the responsibility of the game publisher and was run on third-party servers. Later PS2 online games, however, required the console to be authorized through Sony's Dynamic Network Authentication System (DNAS) before connecting to the server. Unofficial servers also exist, which can be reached by pointing the console's DNS settings at an unofficial DNS server. Most later PS2 online games were developed to exclusively support broadband internet access.
Playing online games requires users to set up the system's network connection configuration, which is saved to a memory card. This can be done with the Network Startup Disk that came with the network adapter, or by using one of the many games that had the utility built in, such as Resident Evil Outbreak. The new slimline PlayStation 2 came with the disk in the box by default. The last version was Network Startup Disk 5.0, which was included with the newer SCPH-90004 model released in 2009.
PAL games that supported online gaming display a WITH NET PLAY logo on their cover. North American games feature an "Online" icon in the lower right corner of the cover; on games that do not support dial-up connectivity, "broadband only" is also found on the logo.
Over time, most game servers have been shut down. However, computer programs such as XBSlink, SVDL and XLink Kai allow users to achieve online play for some PS2 games by using a network configuration that simulates a worldwide LAN.
The main core is a MIPS R5900-compatible CPU with lots of enhancements. This is the first chip that starts executing instructions after the console is turned on. The processor implements the MIPS III ISA, a 64-bit RISC instruction set. Wait, is it just me, or is this the same ISA found on a competitor's console? Not quite: Sony enhanced the ISA by adding some instructions from MIPS IV (prefetch and conditional move), along with their own SIMD extension called 'multimedia instructions'.
The core is complemented by a dedicated floating-point unit (identified as ‘COP1’) that accelerates operations on 32-bit floating-point numbers (also known as floats in C). This is a peculiar block, as it doesn’t follow the IEEE 754 standard, most evident in its absence of infinity (computed as 0 instead).
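A quick way to picture the consequence (this is an illustrative helper, not SDK code): on an IEEE 754 machine a float division by zero yields infinity, whereas code targeting the EE’s FPU has to expect 0 instead.

```cpp
#include <cstdio>

// Emulates the EE-style behaviour described above: no infinity, so x/0 -> 0.
float ee_style_div(float a, float b) {
    return (b == 0.0f) ? 0.0f : a / b;
}

int main() {
    volatile float zero = 0.0f;            // volatile keeps the compiler from folding the division
    float ieee = 1.0f / zero;              // IEEE 754: +inf
    float ee   = ee_style_div(1.0f, 0.0f); // EE-style: 0
    std::printf("IEEE 754: %f, EE-style: %f\n", ieee, ee);
    return 0;
}
```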
Whether we want it or not, with the amount of traffic happening inside the Emotion Engine, this design will eventually suffer the consequences of the unified memory architecture (UMA): multiple independent components trying to access main memory at the same time, causing congestion. To correct these issues, Sony alleviated the constant need for memory by wrapping the processors with lots of cache, so main memory is only accessed when absolutely necessary (99% of the cache/scratchpad mentions in this article will be for this reason).
Inside the same Emotion Engine package, there is yet-another processor called Image Processing Unit or ‘IPU’, this time designed for image decompression. As the successor of the MDEC, the IPU can be useful when a game needs to decode an MPEG2 movie without jamming the main CPU.
Long story short, the game sends compressed image streams to the IPU (hopefully using DMA) which are then decoded in a format that the GPU can display. The PS2’s operating system also relies on the IPU to provide DVD playback.
It’s been two years since the rivals presented their latest offering. If you read the former article and just started reading this one, I presume you are still waiting for ‘the thing’ that makes the PS2 as powerful as it seemed back then. Now, let me introduce a very important set of components Sony fitted in the Emotion Engine, the Vector Processing Units or ‘VPU’.
The second VPU, the VPU1, is an enhanced version of the VPU0 with quadruple the amount of micro memory and VU memory. Moreover, this unit includes an additional component called the Elementary Function Unit (EFU), which speeds up the execution of exponential and trigonometric functions.
The VPU1 is located between the VPU0 and the Graphics Interface (the ‘gate’ to the GPU), so it includes additional buses to feed the geometry to the GPU as quickly as possible and without using the main bus.
A useful approach that can be exploited with these units is procedural generation. In other words, instead of building the scene using hard-coded geometry, let the VPUs generate it using algorithms. In this case, the VPU computes mathematical functions to produce the geometry which is then interpreted by the GPU (i.e. triangles, lines, quadrangles, etc) and ultimately used to draw the scene.
On the other hand, procedural content may struggle with animations and, if the algorithm is too complex, the VPU might not generate the geometry in the required time.
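As a toy illustration of the idea (ordinary C++ on the CPU rather than VU microcode, with made-up names), a rippling strip of geometry can be produced from a function each frame instead of being stored as a mesh:

```cpp
#include <cmath>
#include <vector>

struct Vertex { float x, y, z; };

// Evaluate y = f(x, t) to build the vertices of a triangle strip on the fly.
// Nothing is stored between frames except the parameters of the function.
std::vector<Vertex> ripple_strip(int segments, float z, float width, float t) {
    std::vector<Vertex> strip;
    strip.reserve((segments + 1) * 2);
    for (int i = 0; i <= segments; ++i) {
        float x = width * static_cast<float>(i) / static_cast<float>(segments);
        float y = 0.1f * std::sin(10.0f * x + t);   // the "algorithm" producing the surface
        strip.push_back({x, y, z});                 // near row
        strip.push_back({x, y, z + 0.1f});          // far row; consecutive pairs form the strip
    }
    return strip;
}
```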
With these new additions, programmers now have a lot of flexibility to design their graphics engines. To assist with this, Sony spent additional resources to devise and document efficient pipeline designs. The following are examples of graphics pipelines optimised for different types of workloads:
In the first example, the Parallel design, the CPU is combined with the VPU0 in macromode to produce geometry in parallel with the VPU1. The CPU/VPU0 group makes full utilisation of scratchpad and cache to avoid using the main bus, which the VPU1 relies on to fetch data from main memory. In the end, both rendering groups concurrently send their respective Display Lists to the GPU.
These have so far been examples from a theoretical point of view, but to explain a more ‘practical’ implementation, I’m going to refer to a video Jon Burton published regarding the development of one of Traveller’s Tales’ PS2 games.
The former director of Traveller’s Tales explained how his team achieved a particle system fully encapsulated within the VPU1. In a nutshell, the VPU1 focused on reading a pre-populated database from its VU memory; the database was used to calculate the coordinates of particles at any given time without depending on any other component. The result of the operation could be transformed into Display Lists and sent right away.
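A rough sketch of that approach (in C++ rather than VU1 microcode, with hypothetical names): each database record only stores a particle’s spawn conditions, and its position at any time t is evaluated in closed form, so no per-frame state has to be read from, or written back to, any other component.

```cpp
#include <vector>

struct Vec3 { float x, y, z; };

// A "database" record: everything needed to know where the particle is at any time.
struct ParticleSpawn {
    Vec3  origin;
    Vec3  velocity;
    float spawn_time;
    float lifetime;
};

// Closed-form evaluation: position is a pure function of the record and the clock.
bool particle_position(const ParticleSpawn& p, float now, Vec3& out) {
    float t = now - p.spawn_time;
    if (t < 0.0f || t > p.lifetime) return false;   // not alive at this moment
    const float g = -9.8f;                          // assumed constant gravity
    out.x = p.origin.x + p.velocity.x * t;
    out.y = p.origin.y + p.velocity.y * t + 0.5f * g * t * t;
    out.z = p.origin.z + p.velocity.z * t;
    return true;                                    // caller turns this into Display List data
}
```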
The Emotion Engine kickstarts the Graphics Synthesizer by filling its embedded DRAM with the required materials (Texture bitmaps and Colour Lookup tables, the latter are also known as ‘CLUT’), assigning values on the GS’s registers to configure it, and finally, issuing the drawing commands (Display Lists) which instruct the GS to draw primitives (points, lines, triangles, sprites, etc) at specific locations of the screen.
Using the previously calculated values, the renderer generates pixels from the primitives. This unit can generate 8 pixels (with textures) or 16 pixels (without textures) concurrently, and each pixel entry has properties calculated for it, including RGBA: the gradient of Red, Green, Blue and Alpha (transparency).
Here certain pixels will be discarded if they don’t meet several requirements. Among the tests carried out is the Alpha test, which compares the alpha value (transparency) of a pixel against a ‘standard’ value; this is because in some cases the alpha value is required to be within a certain range, or greater/less than an arbitrary value.
The last stage can apply some effects over the new pixels using the previous frame buffer found in local DRAM. One of these is Alpha Blending, which merges the colours of the current buffer with the previous one in memory.
Another is Colour Clamping: after applying operations like Alpha Blending, the new RGB value may exceed the valid range (0-255), so clamping sets the value back within that range.
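A small software model of those three steps (alpha test, alpha blending and colour clamping) may help; this is only a conceptual sketch, assuming the common GS convention that an alpha of 0x80 acts as 1.0, and it is not how the GS is actually programmed.

```cpp
#include <algorithm>

struct Pixel { int r, g, b, a; };

// Alpha test in a "greater or equal" mode: discard the pixel if its alpha
// falls below the reference value.
bool alpha_test(const Pixel& p, int reference) {
    return p.a >= reference;
}

// Alpha blending: merge the incoming colour with the one already in the
// frame buffer, treating a source alpha of 128 (0x80) as 1.0.
Pixel alpha_blend(const Pixel& src, const Pixel& dst) {
    auto mix = [&](int s, int d) { return (((s - d) * src.a) >> 7) + d; };
    return { mix(src.r, dst.r), mix(src.g, dst.g), mix(src.b, dst.b), dst.a };
}

// Colour clamping: bring each channel back into the valid 0-255 range.
Pixel colour_clamp(Pixel p) {
    p.r = std::clamp(p.r, 0, 255);
    p.g = std::clamp(p.g, 0, 255);
    p.b = std::clamp(p.b, 0, 255);
    return p;
}
```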
It’s worth mentioning that games like Dragon Quest implemented a custom lighting model called Cel Shading (a term I have mentioned before), however, in my previous articles I explained that the GPU was mainly responsible for this. In the PS2 case, the required colour calculations are presumably done by the Emotion Engine, since the GS isn’t as flexible as other GPUs.
As stated before, the PCRTC sends the frame buffer through the video signal. The interface can broadcast video using a wide range of formats (to work with TVs from any geographical region). PAL, for instance, sends up to 640x512 pixels at 50 Hz, either progressive (576p) or interlaced (576i); no games found in the market use 576p, and while some support progressive mode, they do so in 480p instead.
Does that mean the PS2 can ‘display HD’? Technically… yes, but I don’t think most game studios risked the performance penalty for a format that wasn’t popularised yet.
The video-out port (Multi A/V) is very convenient. It carries RGB, Component, S-Video and composite. So, all the important signals are there without requiring proprietary adapters or internal modifications.
Curiously enough, these are still two independent processors and to configure them, you have to alter their registers. However, Sony warned developers that both sets of registers have to be set with 1/48000 seconds of gap. If you hurry too much, the behaviour of the SPU2 becomes unpredictable!
Finally, the chip can mix all channels to provide stereo output. Now, here is the interesting part: The SPU2 can feed itself the mixed stereo sample as new input, this enables the EE to access it (to mix it with even more audio, for instance), or keep adding more effects (such as reverb, echo and delay).
To start with, there’s a dedicated processor that arbitrates communication between the different components. This CPU is none other than the original MIPS R3000-based core found in the PlayStation 1. This time it’s called the IOP and runs at 37.5 MHz over a 32-bit bus.
The IOP communicates with the Emotion Engine using a specialised I/O interface called System Interface or ‘SIF’, both endpoints use their DMA units to transfer data between each other. The IOP also contains its own memory used as a buffer.
In later revisions of this console, the IOP was replaced with a PowerPC 401 ‘Deckard’ and 4 MB of SDRAM (2 MB more than before), backwards compatibility persisted but through software instead.
What about the ‘experimental’ ones? To start with, there was initially a front i.LINK port (also known as IEEE 1394, or ‘FireWire’ in the Apple world). This port was used to connect two PS2s for local multiplayer, but it was removed after the third revision (presumably superseded by the network card).
Next to the controller slot is the Memory Card slot which is compatible with PS1 and PS2 cards. The new cards embed extra circuitry for security purposes referred to as MagicGate, which enables games to block data transfers between different memory cards.
There’s a 4 MB ROM chip fitted on the motherboard that stores a great amount of code used to load a shell menu that users can interact with; it also provides system calls to simplify I/O access.
Start the IOP processor and send it modules; these enable the IOP to handle this console’s hardware. In the end, the IOP is put in a ‘waiting for command’ state. The use of modules allows Sony to release new hardware revisions of the PS2 without changing the IOP, lowering production costs.
The level of popularity this system achieved during the noughties is unprecedented: at the end of its lifespan (2013, after 13 years!), the game library was filled with 1850 titles.
What happened here is really impressive. The PS2 doesn’t have a ‘programmer-friendly’ architecture (as seen from the perspective of a PC programmer), yet with such a number of games developed, I too wonder whether there were more factors involved (such as ‘licensing reliefs’, low distribution costs, low cost of development, the small form factor and so on).
On the software side, there was the PlayStation 2 SDK, which included the Emotion Engine toolchain: a set of C and C++ compilers, assemblers, linkers and debuggers used to control each element of the EE. The main CPU was mostly programmed in C/C++; however, performance-critical components like the vector units were programmed in assembly (microcode/macrocode) instead. The pack also included an ‘Emotion Engine simulator’, which could roughly test out code without sending it to the real hardware, although the simulator wasn’t as accurate as the physical EE chip.
On the hardware side, Sony provided studios with dedicated hardware to run and debug games in-house. The initial devkits were bare boards stacked together to replicate the unreleased hardware of the PS2. Later kits (named Development Tool) had a more presentable appearance, enhanced I/O, and combined workstation hardware (running Red Hat 5.2) with PS2 hardware to build and deploy the game in the same case.
Due to the type of medium used, not only games could be played, but also movies. This requires a decoder able to read the DVD movie format; for that, the PS2 initially shipped with the required bits installed on the memory card (after all, the card is just a storage medium), but later models came with the DVD software pre-installed in the BIOS ROM.
Apart from all these games with their fancy graphics, Sony released a Linux distribution based on ‘Kondara’ (which is in turn based on Red Hat 6), available on two DVDs (the first disc called ‘Runtime Environment’ and the second called ‘Software Packages’), along with a VGA adapter, a USB keyboard and mouse, and some developer manuals. The pack was known as the Linux Kit, and you could run the OS by booting the first DVD and then proceeding like any old-school Linux environment. You obviously needed a hard drive fitted in the console to install the Linux distro. Once installed, the first DVD was still always required to boot this OS.
The Linux Kit included compilers targeting the EE (gcc 2.95.2 with glibc 2.2.2) and assemblers targeting the vector units, along with a window system (XFree86 3.3.6) ‘accelerated’ by the Graphics Synthesizer.
As with any other console of its generation (and previous ones) using disc-based media, it was only a matter of time before third-party companies reverse-engineered the DVD subsystem. The goal here was to find a usable exploit that could force the driver to navigate the file system without needing an out-of-reach map file.
Along with the modchips, which required soldering skills to install, unauthorised but ‘genuine’ discs appeared on the market. These made it possible to defeat the region protection and use in-game cheats by patching the OS. Moreover, ‘cheat discs’ had the advantage of not requiring any modification of the console. I guess the best example to mention is CodeBreaker.
This doesn’t necessarily require altering the console. However, depending on the model, the external case of the PS2 will have to be tampered with to block the eject sensors of the drive. In some cases, placing cotton in certain places will do the trick.
The PS2 stores a database file called TITLE.DB on the Memory Card, which contains information used to optimise the emulation of PS1 games (strike one). The information parser is implemented using strncpy(), a C function that copies strings (chains of characters) from one place to another.
For people familiar with C, you have probably guessed where I’m going. The thing is that strncpy() doesn’t know how long a string is, so unless it’s terminated (by placing \0 at the end of the chain), the copy goes on ‘forever’ (with unpredictable results!). Luckily, this function takes a parameter that specifies the maximum number of bytes to be copied, protecting the copy from buffer overflows. As ludicrous as it may seem, Sony didn’t make proper use of this limit, even though each database entry has a fixed size of 256 bytes (strike two).
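To illustrate the class of bug being described (this is a hypothetical sketch, not Sony’s actual parser): copying an attacker-controlled string into a fixed 256-byte record without bounding the copy to the record’s size lets it run past the buffer and clobber whatever sits next to it in memory.

```cpp
#include <cstring>

struct TitleEntry { char data[256]; };   // each TITLE.DB record is a fixed 256 bytes

// Unsafe: nothing stops the copy at 256 bytes, so a long enough source string
// overflows 'data' and overwrites adjacent memory (e.g. a saved return address).
void parse_entry_unsafe(TitleEntry& e, const char* src) {
    std::strcpy(e.data, src);
}

// Safe: the copy is bounded by the destination's size and always terminated.
void parse_entry_safe(TitleEntry& e, const char* src) {
    std::strncpy(e.data, src, sizeof e.data - 1);
    e.data[sizeof e.data - 1] = '\0';    // strncpy does not null-terminate a truncated copy
}
```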
Upon closer inspection in RAM, TITLE.DB happens to be copied next to a saved register, $ra, which holds the address to return to after the currently executing function finishes (strike three). This leads to the Independence exploit: craft a TITLE.DB with a large string, embed an executable in it, and design the string so that $ra is overwritten to point to the executable. If you manage to upload that file to your Memory Card (through another exploit or a PC USB adapter), you’ve got yourself a simple homebrew launcher.
Some time ago, it was discovered that the BIOS of this console could be upgraded from the Memory Card. This function was never used in practice, but neither was it removed (at least during most of the console’s lifespan). With this, hackers realised that if they found a way to install particular software onto the Memory Card, the BIOS would always load it at boot. This discovery led to Free MCBoot, a program presented as ‘upgrade data’ which replaces the original shell with one that can execute homebrew.
Bear in mind these changes are not permanent: a Memory Card with Free MCBoot installed must be inserted prior to the console’s startup. Additionally, this software needs to be installed somehow, so another exploit (e.g. disc swapping) is required to bootstrap the installer.
The same year Free MCBoot was released, another trick was discovered: disguising games as DVD movies, effectively allowing unauthorised game copies to be read without requiring a modchip.
This only required patching the game image to add dummy metadata and partitions normally only used by DVD movies. Then, when the burned copy is inserted into the console, the drive won’t reject it, but it won’t execute the game either; however, with the help of a homebrew program called ESR, the game can be kickstarted.