The IBM T220 and T221 are LCD monitors that were sold between 2001 and 2005, with a native resolution of 3840×2400 pixels (WQUXGA) on a screen with a diagonal of 22.2 inches (564 mm). This works out to 9,216,000 pixels, with a pixel density of 204 pixels per inch (80 dpcm, 0.1245 mm pixel pitch), much higher than contemporary computer monitors (about 100 pixels per inch) and approaching the resolution of print media. The display family was nicknamed "Big Bertha" in some trade journals. Costing around $8,400 in 2003, the displays saw few buyers. Such high-resolution displays would remain niche products for nearly a decade, until modern high-DPI displays such as Apple's Retina display line saw more widespread adoption.
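As a quick sanity check on those figures, here is a minimal back-of-the-envelope sketch in Python; the inputs are just the panel parameters quoted above, nothing else is assumed:

    import math

    # Panel parameters taken from the paragraph above.
    width_px, height_px = 3840, 2400
    diagonal_in = 22.2

    diagonal_px = math.hypot(width_px, height_px)   # ~4528 px corner to corner
    ppi = diagonal_px / diagonal_in                 # ~204 pixels per inch
    pitch_mm = 25.4 / ppi                           # ~0.1245 mm pixel pitch
    total_px = width_px * height_px                 # 9,216,000 pixels

    print(f"{ppi:.0f} ppi, {pitch_mm:.4f} mm pitch, {total_px:,} px")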
The IBM T220 was introduced in June 2001 and was the first monitor to natively support a resolution of 3840×2400. The monitor has two LFH-60 connectors. A pair of cables supplied with the monitor attaches to the connectors and splits into two single-link DVI connectors each, for a total of four DVI channels. One, two or four of the connectors may be used at once.
The IBM T220 comes with a Matrox G200 MMS video card and two power supplies. To achieve native resolution, the screen is sectioned into either four columns of 960×2400 pixels or four tiles of 1920×1200 pixels. Under DOS, the monitor/card combination only supports a 960×1200 screen mode running at 56 Hz.[citation needed] The monitor's native refresh rate is 41 Hz.
This is a revised model of the original T220. Notable improvements include using only one power adapter instead of two, support for more screen modes, and driver support for Linux out of the box. However, power consumption increased from 111 to 135 watts (111 to 150 watts at maximum). The T221 was initially available as 9503-DG1 and 9503-DG3 models. The 9503-DG1 model came with a Matrox G200 MMS graphics card and two LFH-60 connector cables; it originally ran at maximum resolution in four 960×2400 stripes, and later firmware permitted a 1920×1200 tile mode as well.[citation needed] The 9503-DG3 model came with one cable connecting from one or two DVI ports on the graphics card to the T221's LFH-60 sockets.
The IBM T221 started out as an experimental technology from the flat-panel display group at the IBM Thomas J. Watson Research Center. In 2000, a prototype 22.2-inch TFT LCD, code-named "Bertha", was made in a joint effort between IBM Research and IBM Japan. This display had a pixel format of 3840×2400 (QUXGA-W) with 204 ppi. On 10 November 2000, IBM announced the shipment of the prototype monitors to the U.S. Department of Energy's Lawrence Livermore National Laboratory in California. On 27 June 2001, IBM announced the production version of the monitor, known as the T220. In November 2001, IBM announced its replacement, the IBM T221. On 19 March 2002, IBM announced lowering the price of the IBM T221 from US$17,999 to US$8,399. On 2 September 2003, IBM announced the availability of the 9503-DG5 model.
IBM and Chi Mei Group of Taiwan formed a joint venture called IDTech. ViewSonic and Iiyama OEMed the T221 and sold it under their own brand names. The production line of IDTech at Yasu Technologies was sold to Sony in 2005. Chi Mei has since demonstrated a 56-inch 3840×2160 QuadHDTV display.
The IBM T220 first debuted in 2001 and was the first monitor (and apparently, as of 2014, still the last) to feature "Wide Quad Ultra Extended Graphics Array" (WQUXGA). The later enhanced (and better-known) T221 came out in 2003 and was discontinued shortly thereafter in 2005. It was produced as a high-density IPS monitor for high-resolution needs: it showed up mostly in specialized professions and was not purchased by many consumers due to the high price. It seems it was intentionally pitched to be paired with an IntelliStation.
NOTE: As of October 16th, 2014, Apple has released a 5K Retina iMac featuring a 5120×2880 display -- officially the first to beat the T221; it has about 5.5 million more pixels. It took 13 years for this resolution to be surpassed.
One unique aspect of these monitors is that the built-in resolution scaling is pixel-perfect: no Gaussian blurring is applied, as long as the requested resolution divides evenly into the native one and proper integer pixel duplication is possible. They are capable of scaling down properly to as low as 640×480.
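To see which modes divide cleanly, it is enough to check integer divisibility against the 3840×2400 native grid. A small illustrative sketch follows; the mode list is my own selection for demonstration, not taken from the manual:

    NATIVE_W, NATIVE_H = 3840, 2400

    # Integer ("pixel-perfect") scaling works when the native grid divides
    # evenly by the requested mode in both dimensions.
    modes = [(640, 480), (1280, 800), (1920, 1200), (960, 600), (1280, 1024)]

    for w, h in modes:
        if NATIVE_W % w == 0 and NATIVE_H % h == 0:
            print(f"{w}x{h}: pixel-perfect ({NATIVE_W // w}x{NATIVE_H // h} duplication)")
        else:
            print(f"{w}x{h}: no clean division, would need filtering")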
It"s worth noting that due to the fact the T221s can use a maximum of 150 watts (primarily if the monitor is on maximum brightness), they have two cooling fans. Fortunately they"re not very loud, but it"s important that they are replaced/repaired if they fail.
I"ve noticed (even with a brand new T221) that they inject a noisy ground loop into the power grid through the ground. This seems to affect all audio devices on the same power circuit. The manual does explicitly state you need a properly grounded connection which makes the problem seem as if this was known. I opened up the external power brick and Chemicon electrolytics were used: so the issue isn"t due to the quality of the components but something with the design of the power supply.
Even though the design was conceived at the T. J. Watson research center, much of the later implementation and manufacturing was sent to IBM Japan in coordination with IBM's (at the time) IDTech. As such, most warranties found with T221s are for Japan, as many were later re-sold in North America. The warranty card has some unusual design features, including repetition of the stripeless IBM logo:
NOTICE: because nVidia has restricted their advanced multi-monitor software (Mosaic) to Quadro cards, you must purchase a Quadro that has Mosaic capabilities. Consumer nVidia cards (such as the GTX line) will not work, as they are restricted to nVidia Surround; and nVidia Surround does not work with the T221.
The card of choice I went with was an nVidia NVS 510 (because it has four outputs and can operate well with my System x3300 M4; this particular System x refuses to work properly with AMD GPUs of any flavour). There are two ways to approach configuring the T221; the first is more optimal, the second less so. But before that is explained, here are the common resolutions you will be working with when setting up "virtual monitors" on the T221, summarised in the sketch below (since the T221 can't receive the full resolution at a usable refresh rate from a single connector, the controller divides the panel up into either four or two "virtual monitors").
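The layouts below are a summary of the numbers given across this page (41 Hz native, 48 Hz via the custom-resolution trick described later, 13 Hz over a single link), not an official specification:

    # Common "virtual monitor" layouts for driving a T221, per the notes above.
    # Each entry: (tile count, tile resolution, layout, typical refresh in Hz)
    layouts = [
        (4, (960, 2400),  "1x4 columns", 41),   # four single-link DVI channels
        (4, (1920, 1200), "2x2 tiles",   41),   # four single-link DVI channels
        (4, (1920, 1200), "2x2 tiles",   48),   # later firmware / custom modes
        (1, (3840, 2400), "single tile", 13),   # one single-link DVI channel
    ]

    for n, (w, h), layout, hz in layouts:
        assert n * w * h == 3840 * 2400          # every layout covers the full panel
        print(f"{n} x {w}x{h} ({layout}) @ {hz} Hz")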
Load up the NVIDIA Control Panel, go to Display > Change resolution. Go through each of the "IBM Digital Displays" and set them to 1920×1200; NVIDIA will default the values to 41 Hz. Then go to: Workstation > Set up Mosaic > Create new configuration. Select "4" for total displays from the dropdown, and choose the 2x2 topology.
After you enable Mosaic, feel free to power off the T221 for 8 seconds, as the nVidia card will flicker through a bunch of garbage resolutions, causing the T221 to power off and on rapidly. Once that's done, Mosaic will stupidly see it as one display at 3840×2400 and will assign it 13 Hz. Go *back* a screen and set all of the IBM virtual displays in Mosaic to 1920×1200 @ 41 Hz (before you hit Apply, feel free to power off the T221 for 8 seconds again). Once that's done, just go through the rest of the menus in Mosaic and click Finish.
Credit for the hex value goes to "wildpig" from here: https://forums.geforce.com/default/topic/788053/geforce-900-series/gtx-980-4-monitor-surround-and-the-ibm-t221/
Load up the NVIDIA Control Panel, go to Display > Change resolution. Create a new custom resolution (in this case 960×2400 @ 48 Hz or 1920×1200 @ 48 Hz). Assign all of the virtual monitors the custom resolution once finished. Then go back to: Workstation > Set up Mosaic > Create new configuration. Select "4" for total displays from the dropdown, and choose the 2x2 topology if you're using 1920×1200, or 1x4 for 960×2400. It is basically the same process as last time (Mosaic will see it as a single 3840×2400 monitor @ 13 Hz, so you need to go back, reassign the virtual monitors, and then complete the setup).
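As a rough plausibility check on those 48 Hz custom modes: the ~165 MHz single-link DVI pixel-clock ceiling is standard, but the ~8% blanking overhead below is my assumption (real modelines vary), so treat this as an estimate only:

    SINGLE_LINK_CLOCK_HZ = 165e6   # single-link DVI pixel-clock ceiling
    BLANKING_OVERHEAD = 1.08       # assumed ~8% extra for blanking intervals

    for w, h, hz in [(960, 2400, 48), (1920, 1200, 48)]:
        needed_hz = w * h * hz * BLANKING_OVERHEAD
        verdict = "fits" if needed_hz < SINGLE_LINK_CLOCK_HZ else "exceeds"
        print(f"{w}x{h} @ {hz} Hz needs ~{needed_hz / 1e6:.0f} MHz ({verdict} one link)")

Both tiles need only ~120 MHz, comfortably inside one link, which is why 48 Hz per tile is achievable.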
Didn"t Apple find a way around this with their 30" LCD running through ADC ports?Apple"s 30" (and just about every other 30") is only 2560x1600, substantially lower in resolution than the 3840x2400 of the T221.
The monitor was originally created for engineers to use. I believe Los Alamos was one of IBM's first customers for systems using the T220/T221. The low refresh rate and contrast aren't a big deal for this monitor's typical uses.
You have a bear of a task there. We're still using XP to drive these monitors, and as soon as we can free one up to test it with Windows 7 we will, but the thinking is that we are SOL. The new ATI video cards are promising, but whether or not they will drive these correctly is up in the air. I haven't done any googling, though. It's almost like the "HD revolution" is causing everything to take a step back. At this point in time, it's just WAY easier to stick with 2K monitors. But, good luck.
I"ve used my T220 (VP2290b to be specific) on XP for several years without issue. The monitor supports four DVI plugs, but will function on a single DVI connection just fine, but only at 13hz. While this sounds horrid, it"s usable for everything but video (and even most youtube clips are watchable). So if you find yourself wanting 9.2 MP for Photoshop, programming, CAD, etc - it"s entirely usable. There is a SLIGHT "skip" effect with the mouse, but it"s not very noticeable, and there is no flicker. I even plugged the screen into a friend"s Macbook pro, and it ran fine without issue at full resolution. To get above 13Hz, you need to use more plugs, and there are various strategies for doing this. DVI has no "maximum resolution" - just finite bandwidth and you must trade resolution for refresh rate. Running the monitor at 1920x1200 will enable full refresh rate with one connector, if the need arises (Watching a DVD). The "converter box" is supported by newer versions of this screen, and it separates each link in a dual-link DVI, and routes it to each one of the connectors (This monitor was made before dual-link DVI). It"s not officially compatible with the older screens and I"ve never tried. Without the converter box, you use multiple connectors to drive different parts of the screen, which is well documented.
I"d say Windows 7 is probably the least T220-friendly OS (vs XP and Mac), but it definitely works with adequate hardware and some "tweaking" - and there"s no need to spend big money on a professional level videocard or anything "rated" for 3840x2400. YMMV, but every videocard I"ve tried (about 10) supports this resolution in some capacity in XP.
This monitor is not for the faint of heart. I have 20/18 vision and can read it easily from 12-18" away at normal DPI, but several of my friends with even slightly impaired vision find it very difficult to read. I can't imagine getting work done without this monitor - you can view EIGHT documents in "page view" in Word at the same time at 100% zoom. However, it's beyond useless for gaming, which drives a large amount of modern PC purchases. Also, the technology to make this monitor has probably gone down in price, but it was $22,000 USD in 2001 and $8,500 USD in 2004. Even if it fell to $1,500, you can buy a lot of "consumer" monitors for that money. I'm sure that has a lot to do with demand.

On the contrast ratio / general image quality: the measurement system for contrast ratios is anything but standard, and there is a large amount of embellishment with these figures, especially some of the "10000:1" numbers that float around. Whatever the number is, 300 or 400 to 1, it's very conservative. I've seen monitors that were visually superior to this, but only slightly. I own a Sharp Aquos and a Dell UltraSharp (IPS), and the T220 is very competitive. When you factor in its 9.2 MP resolution, it's a no-brainer for photo editing.

In closing, for those who haven't seen a 220-DPI monitor: the pixels are just small enough that the eye can still resolve the individual pixels - but NOT the subpixels. It's absolutely amazing to get 4" away from the monitor and just see more detail (like with paper), and not a bunch of RGB elements. This also means there are almost no aliasing effects.
Holy sweet mother of Christ. That thing has twice the resolution of my two 24" LCDs combined, and all shoehorned into a little 22" panel. That'd be a bit small for my tastes, but I'd love to see that on a 30" LCD myself. I'd definitely make room on my desk for something like that.
Anyway didn"t mean to brag. Hopefully the OP / anyone else with T220 questions will benefit from this, and I"m glad so many people are impressed - hopefully companies will make monitors with higher resolutions (I love how to be labeled "1080p" many desktop monitors have robbed us of 120 vertical pixels, going from 1900x1200 to 1900x1080).
Judging by that calculation, a standard 24" 1920×1200 LCD (like the pair I have) is ~94 DPI. The T221 is ~204 DPI. I wouldn't mind a panel somewhere in the middle of that range, say ~150 DPI or so (about the DPI of cell-phone screens, it seems). For a 24" 16:10 panel, that's round about 3050×1910, as the sketch below shows. That in itself would be quite the upgrade. Incompatible as hell with any sort of gaming, but interesting nonetheless.
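Going from a target density back to a pixel format is the same arithmetic inverted; a minimal sketch (the 150 DPI, 24", 16:10 target is just the example from the post above):

    import math

    def resolution_for(ppi, diagonal_in, aspect_w, aspect_h):
        """Pixel format needed for a given density, diagonal and aspect ratio."""
        diagonal_px = ppi * diagonal_in
        unit = diagonal_px / math.hypot(aspect_w, aspect_h)
        return round(unit * aspect_w), round(unit * aspect_h)

    print(resolution_for(150, 24, 16, 10))    # ~(3053, 1908) for the mid-range panel
    print(resolution_for(204, 22.2, 16, 10))  # ~(3840, 2400), i.e. the T221 itself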
These monitors use IPS panels, run upwards of 5 MP, and have better color depth than consumer-grade monitors. Oftentimes they are grayscale (most of these images are not in color).
Does the FDA allow them to be used for diagnosis, or do they have to "technically" make the diagnosis after looking at one of those monitors you linked (or similar from another company)?
Plus, even the final revision of the T221 required being driven as either two monitors using one dual-link DVI and one single-link DVI, or as four monitors using four single DVI links. Even then, the internal refresh rate was a slightly annoying 48 Hz.
Also keep in mind that it wasn't until extremely recently (the last few years) that LCD prices have dropped like crazy. People back in 2003 were used to paying these kinds of prices.
Maybe just one to start out with. :) I can rationalize such caution against my desire to splurge thus: with any luck, this will spark a little bit of necessary competition in desktop displays, an area stagnant since 2005's introduction of 30" 2560×1600 monitors. Better not to have invested heavily in the initial offering.
On the other hand, I"ve been dealing with four fluorescent backlit 30" monitors for years. LED backlit would be an upgrade even with an edge-lit design.
It makes no sense that this display still has less resolution than some ThinkCentre monitors from the early '00s. It makes no sense that that crap 1366×768 is still being sold now.
I organize my workspaces vertically: I have 3x 16:10 monitors split into (roughly) 6 vertical workspaces, so I only have two physical seams. If I used six separate portrait displays instead, I would have 5 vertical seams.
I haven"t had the same experience on any flat panel display I"ve used till now but I"m on the lookout for a good 120Hz LCD gaming monitor in the hope I get the same experience again.
A number of common resolutions have been used with computers descended from the original IBM PC. Some of these are now supported by other families of personal computers. These are de facto standards, usually originated by one manufacturer and reverse-engineered by others, though the VESA group has co-ordinated the efforts of several leading video display adapter manufacturers. Video standards associated with IBM-PC-descended personal computers are shown in the table below, alongside those of early Macintosh and other makes for comparison. (From the early 1990s onwards, most manufacturers moved over to PC display standards, thanks to widely available and affordable hardware.)
Unnamed. A common size for LCDs manufactured for small consumer electronics and mobile phones, typically in a 1.7" to 1.9" diagonal size. This LCD is often used in the portrait (128×160) orientation. The unusual 5:4 aspect ratio makes the display slightly different from the QQVGA dimensions. Resolution: 160×128 (20k); aspect ratio: 5:4.
QVGA (Quarter Video Graphics Array). Half the resolution in each dimension as standard VGA. First appeared as a VESA mode (134h = 256 colour, 135h = Hi-Color) that primarily allowed 80×30 character text with graphics, and should not be confused with CGA (320×200). A retronym for CGA "medium" and EGA/MCGA/VGA "low" pixel resolution, normally used when describing screens on portable devices (pocket media players, cellular phones, PDAs etc.). No set colour depth or refresh rate is associated with this standard or those that follow, as it is dependent both on the manufacturing quality of the screen and the capabilities of the attached display driver hardware, and almost always incorporates an LCD panel with no visible line-scanning. However, it would typically be in the 8-to-12 bpp (256 to 4096 colours) through 18 bpp (262,144 colours) range. Resolution: 320×240 (75k); aspect ratio: 4:3.
ST Colour (Atari ST Colour, broadcast standard). Atari ST line. Colour modes using NTSC- or PAL-compliant televisions and monochrome, composite video or RGB-component monitors. Resolution: 640×200, 320×200; aspect ratio: 4:3 (or 16:10 with square pixels); colour depth: 2~4 bpp for ST, 8~15 bpp on later models (TT, Falcon).
ST Mono (Atari ST Monochrome, proprietary standard). Atari ST line. Hi-res monochrome mode using a custom non-interlaced monitor, with the slightly lower vertical resolution (an integer multiple of the low and medium resolutions, so the same amount of RAM serves for the framebuffer) allowing a "flicker-free" 71.25 Hz refresh rate, higher even than VGA. Later machines in the series could also use colour or monochrome VGA monitors. Resolution: 640×400; aspect ratio: 4:3 (or 16:10 with square pixels); colour depth: 1 bpp for ST, 4~6 bpp greyscale on later models (TT, Falcon), plus 8 bpp colour on VGA monitors.
Video monitor I/NI (full-broadcast-resolution video monitor or television). Commodore Amiga line and others (e.g. Acorn Archimedes, later Atari models (TT, Falcon)). They used NTSC- or PAL-compliant televisions and monochrome, composite video or RGB-component monitors. The interlaced (I) mode produced visible flickering of finer details, eventually fixable by use of scan-doubler devices and VGA monitors. Resolution: 720×480i/576i maximum; typically 640×400i/512i or 640×200/256 NI, and 320×200/256 NI for games. Aspect ratio: 4:3 (non-square pixels). Colour depth: up to 6 bpp for Amiga (8 bpp with later models), typically 2~4 bpp for most hi-res applications (saving memory and processing time), 4~5 bpp for games, and "fake" 12/18 bpp for static images (HAM mode); up to 15 bpp for Archimedes and Falcon (12 bpp for TT), but typically 4 bpp in use.
Mac Colour (Apple Mac II and later models). The second-generation Macintosh, launched in 1987, came with colour (and greyscale) capability as standard, at two levels depending on monitor size: 512×384 pixels (one-quarter of the later XGA standard) on a 12" (4:3) colour or greyscale ("monochrome") monitor, or 640×480 with a larger (13" or 14") high-resolution monitor (superficially similar to VGA but at a higher 67 Hz refresh rate), with 8-bit colour/256 grey shades at the lower resolution, and either 4- or 8-bit colour (16/256 greys) in high resolution depending on installed memory (256 or 512 kB), all out of a full 24-bit master palette. The result was equivalent to VGA or even PGC - but with a wide palette - at a point simultaneous with the IBM launch of VGA. Later, larger monitors (15" and 16") allowed use of an SVGA-like, binary-half-megapixel 832×624 resolution (at 75 Hz) that was eventually used as the default setting for the original, late-1990s iMac. Even larger 17" and 19" monitors could attain higher resolutions still, when connected to a suitably capable computer, but apart from the 1152×870 "XGA+" mode discussed further below, Mac resolutions beyond 832×624 tended to fall into line with PC standards, using what were essentially rebadged PC monitors with a different cable connection. Mac models after the II (Power Mac, Quadra, etc.) also allowed at first 16-bit High Colour (65,536 or "Thousands of" colours) and then 24-bit True Colour (16.7M or "Millions of" colours), but much like PC standards beyond XGA, the increase in colour depth past 8 bpp was not strictly tied to changing resolution standards. Resolution: 512×384 (197k), 640×480 (307k), 832×624 (519k); aspect ratio: 4:3; colour depth: 4 bpp, 8 bpp, and later 16/24 bpp.
PowerBook internal panel (Apple PowerBook, early generations). The first PowerBook, in 1991, replaced the original Mac Portable (basically an original Mac with an LCD, keyboard and trackball in a lunchbox-style shell) and introduced a new 640×400 greyscale screen. This was joined in 1993 by the PowerBook 165c, which kept the same resolution but added colour capability similar to that of the Mac II (256 colours from a palette of 16.7 million). Resolution: 640×400 (256k); aspect ratio: 16:10 / 8:5 (square pixels); colour depth: 8 bpp.
MDA (Monochrome Display Adapter). The original standard on IBM PCs and IBM PC XTs, with 4 kB video RAM. Introduced in 1981 by IBM. Supports text mode only. Resolution: 720×350 (text); aspect ratio: 72:35 (effectively 4:3 (non-square pixels) on CRTs, but could be a variety of aspects on LCDs); colour depth: 1 bpp.
CGA (Color Graphics Adapter). Introduced in 1981 by IBM, as the first colour display standard for the IBM PC. The standard CGA graphics cards were equipped with 16 kB video RAM. Resolution: 640×200 (128k).
EGA (Enhanced Graphics Adapter). Introduced in 1984 by IBM. A resolution of 640×350 pixels with 16 different colours (4 bits per pixel, or bpp), selectable from a 64-colour palette (2 bits per each of red-green-blue). Cards with installed memory insufficient for full colour above 320×200 could still offer full EGA resolution (and CGA hi-res) in monochrome. Resolution: 640×350 (224k), 640×200 (128k), 320×200 (64k); aspect ratio: 64:35, 16:5 and 16:10/8:5 (all effectively 4:3); colour depth: 4 bpp.
MCGA (Multicolor Graphics Adapter). Introduced by IBM on ISA-based PS/2 models in 1987, at reduced cost compared to VGA. MCGA had a 320×200 256-colour (from a 262,144-colour palette) mode, and a 640×480 mode only in monochrome, due to its 64 kB of video memory compared to the 256 kB of VGA. Resolution: 320×200 (64k).
VGA (Video Graphics Array). Introduced on MCA-based PS/2 models in 1987. VGA is actually a set of different resolutions, but is most commonly used today to refer to 640×480 pixel displays with 16 colours (4 bits per pixel) and a 4:3 aspect ratio. Other display modes are also defined as VGA, such as 320×200 at 256 colours (8 bits per pixel) and a text mode with 720×400 pixels. VGA displays and adapters are generally capable of Mode X graphics, an undocumented mode allowing increased non-standard resolutions, most commonly 320×240 (with 8 bpp and square pixels). Resolution: 640×480 (307k) (hi-res graphics and LCD text).
SVGA (Super Video Graphics Array). An extension to VGA defined by VESA for IBM PC compatible personal computers in 1989, meant to take advantage of video cards exceeding the minimum 256 kB of the VGA standard. It displayed the regular VGA modes, plus 800×600 in 16 colours at a slightly lower 56 Hz refresh rate, leading to 800×600 sometimes being referred to as "SVGA resolution" today. Resolution: 800×600 (480k); aspect ratio: 4:3.
8514/A. An IBM display adapter introduced in 1987 alongside the PS/2 line. Its high-resolution 1024×768 mode became a de facto general standard in a succession of computing and digital-media fields for more than two decades, arguably more so than SVGA, with successive IBM and clone video cards and CRT monitors (a multisync monitor's grade being broadly determinable by whether it could display 1024×768 at all, or show it interlaced, non-interlaced, or "flicker-free"), LCD panels (the standard resolution for 14" and 15" 4:3 desktop monitors, and a whole generation of 11-15" laptops), early plasma and HD-ready LCD televisions (albeit at a stretched 16:9 aspect ratio, showing down-scaled material), professional video projectors, and most recently, tablet computers.

XGA (Extended Graphics Array). An IBM display standard introduced in 1990. XGA built on 8514/A's existing 1024×768 mode and added support for "high colour" (65,536 colours, 16 bpp) at 640×480. The second revision ("XGA-2") was a more thorough upgrade, offering higher refresh rates (75 Hz and up, non-interlaced, up to at least 1024×768), improved performance, and a fully programmable display engine capable of almost any resolution within its physical limits. For example: 1280×1024 (5:4) or 1360×1024 (4:3) in 16 colours at 60 Hz; 1056×400 [14h] text mode (132×50 characters); 800×600 in 256 or 64k colours; and even as high as 1600×1200 (at a reduced 50 Hz scan rate) with a high-quality multisync monitor (or an otherwise non-standard 960×720 at 60 Hz on a lower-end one capable of high refresh at 800×600, but only interlaced mode at 1024×768). Resolution: 1024×768 (786k); aspect ratio: 4:3.
SXGA (Super Extended Graphics Array). A widely used aspect ratio of 5:4 (1.25:1) instead of the more common 4:3 (1.33:1), meaning even 4:3 pictures and video will appear letterboxed on the narrower 5:4 screens. This is generally the native resolution - with, therefore, square pixels - of standard 17" and 19" LCD monitors. It was often a recommended resolution for 17" and 19" CRTs also, although as they were usually produced in a 4:3 aspect it either gave non-square pixels or required adjustment to show small vertical borders at each side of the image. Allows 24-bit colour in 4 MB of graphics memory, or 4-bit in 640 kB. Some manufacturers, noting that the de facto industry standard was VGA (Video Graphics Array), termed this the Extended Video Graphics Array, or XVGA. Resolution: 1280×1024 (1310k); aspect ratio: 5:4; colour depth: 24 bpp.
SXGA+ (Super Extended Graphics Array Plus). Used on 14-inch (360 mm) and 15-inch (380 mm) notebook LCD screens and a few smaller screens, until the eventual market-wide phasing-out of 4:3 aspect displays. Resolution: 1400×1050 (1470k); aspect ratio: 4:3; colour depth: 24 bpp.
WXGA+ (WSXGA) (Widescreen Extended Graphics Array Plus). An enhanced version of the WXGA format. This display aspect ratio was common in widescreen notebook computers and many 19" widescreen LCD monitors until ca. 2010. Resolution: 1440×900 (1296k); aspect ratio: 16:10; colour depth: 24 bpp.
HD+ (High Definition Plus, 900p). This display aspect ratio is becoming popular in recent notebook computers and desktop monitors. Resolution: 1600×900 (1440k); aspect ratio: 16:9; colour depth: 24 bpp.
WSXGA+ (Widescreen Super Extended Graphics Array Plus). A wide version of the SXGA+ format; the native resolution for many 22" widescreen LCD monitors, also used in larger widescreen notebook computers until ca. 2010. Resolution: 1680×1050 (1764k); aspect ratio: 16:10; colour depth: 24 bpp.
Full HD (Full High Definition, 1080p). This display aspect ratio is the native resolution for many 24" widescreen LCD monitors, and is expected to also become a standard resolution for smaller-to-medium-size wide-aspect tablet computers in the near future (as of 2012). Resolution: 1920×1080 (2073k); aspect ratio: 16:9; colour depth: 24 bpp.
WUXGA (Widescreen Ultra Extended Graphics Array). A wide version of the UXGA format. This display aspect ratio was popular on high-end 15" and 17" widescreen notebook computers, as well as on many 23-27" widescreen LCD monitors, until ca. 2010. It is also a popular resolution for home cinema projectors, besides 1080p, in order to show non-widescreen material slightly taller than widescreen (and therefore also slightly wider than it might otherwise be), and is the highest resolution supported by single-link DVI at standard colour depth and scan rate (i.e. no less than 24 bpp and 60 Hz non-interlaced). Resolution: 1920×1200 (2304k); aspect ratio: 16:10; colour depth: 24 bpp.
QXGA (Quad Extended Graphics Array). This is the highest resolution that generally can be displayed on analogue computer monitors (most CRTs), and the highest resolution that most analogue video cards and other display transmission hardware (cables, switch boxes, signal boosters) are rated for (at 60 Hz refresh). 24-bit colour requires 9 MB of video memory (and transmission bandwidth) for a single frame. Also the native resolution of medium-to-large latest-generation (2012) standard-aspect tablet computers. Resolution: 2048×1536 (3146k); aspect ratio: 4:3; colour depth: 24 bpp.
WQXGA (Widescreen Quad Extended Graphics Array). A wide version of the QXGA format; the native resolution for many 30" widescreen LCD monitors, and also used on the 13.3" MacBook Pro with Retina display. The highest resolution supported by dual-link DVI at a standard colour depth and non-interlaced refresh rate (i.e. at least 24 bpp and 60 Hz). Requires 12 MB of memory/bandwidth for a single frame. Resolution: 2560×1600 (4096k); aspect ratio: 16:10; colour depth: 24 bpp.
WQUXGA (Wide Quad Ultra Extended Graphics Array). The IBM T220/T221 LCD monitors supported this resolution, but they are no longer available. Resolution: 3840×2400 (9216k); aspect ratio: 16:10; colour depth: 24 bpp.
8K UHD (8K Ultra High Definition, Super Hi-Vision). A digital format in testing by NHK in Japan (with a partnership extending to the BBC for test coverage of the 2012 London Olympic Games), intended to provide effectively "pixel-less" imagery even on extra-large LCD or projection screens. Resolution: 7680×4320 (33177k); aspect ratio: 16:9; colour depth: 30~36 bpp.
The U.S. hardware maker recently leaked details on its new 24-inch 4K monitor, the Dell UltraSharp 24 Monitor (UP2414Q). Spotted on Friday by BSN and other news outlets, the LED monitor features a 3,840×2,160 display (a density of 185 pixels per inch) with nearly half the total surface area of 32-inch "Ultra HD" monitors from Dell, Sharp, Asus, and others.
Toshiba "announced" an exact clone of the IBM T221 in November 2007, but Toshiba did not produce it then, or in the (recession)years that followed. Neither Toshiba nor others on the Internet have offered any explanation.