For example, on a device that is stable with a single touch, it is easy to see that tracking becomes unstable once three or more touch points are involved.
1.5.1 Responding to pen pressure. Each touch ID is drawn in a subtly different colour (five or more repeat the same colours), and an additional full-screen bug has been fixed.
The best touchscreen monitors can offer advantages for certain workflows. Whether it’s for creative use or to improve general productivity tasks, the best touchscreen displays can make navigating certain programs more intuitive and more precise, particularly for tasks like making a selection in an image.
They can deliver a seamless, responsive experience that feels like writing with a pen on paper, and an immediacy that you don't get with even the best mice or the best keyboards. But while touch screens now abound in phones and tablets, most monitors don't offer touch. There are some excellent touch displays out there, however.
Below, we've made our pick of the best touchscreen monitors after evaluating a range of options for their accuracy and responsiveness, design, extra features and price. From regular-sized displays ideal for a desktop PC to portable monitors for those on the road, these are the best touchscreen monitors we've found.
If you prefer a more traditional monitor, possibly with a higher resolution, check out guides to the best monitors for photo editing and the best 4K monitors. If accurate colours are important to you, whether you’re a photographer or video editor, you might want to invest in one of the best monitor calibrator tools.
With so many options on the market, our choice of the best touchscreen monitors comes down to the details. And detail is something that Dell's P2418HT monitor does brilliantly. This 1080p, 23.8-inch LCD panel delivers excellent resolution, contrast, and colour. Moreover, it boasts an anti-glare surface that works beautifully in distracting light conditions, as well as ultra-thin bezels that give it a stylish flair and give you more screen real estate.
Looking for a cheap touchscreen monitor from a reputable brand? The 21.5in Dell P2219H IPS monitor is available at a brilliant price, and it still does an impressive job, making it one of the best touchscreen monitors available for those on a tighter budget.
While creative professionals usually go for larger screens, there’s definitely a place for portable monitors in content creation. Nomadic users in particular can benefit from a portable monitor that’s designed specifically with video editors, designers, and music producers in mind.
The ProArt Display PA148CTV is something of a rarity in the sea of portable monitors with its robust set of features targeted towards creatives. They include the Asus Dial, a physical dial that you can use to make effortless adjustments to your project whether you’re in Lightroom, Premiere Pro, or Photoshop. There’s also the Virtual Control Panel function, which allows you to use the display itself as your touchscreen control panel, simplifying your workflow.
The ViewSonic TD2230 is a small, light and portable touchscreen monitor, making it perfect for anyone with limited desk space and/or who needs to travel with their screen. The 22in, Full HD, IPS display offers beautifully sharp image quality and high visual accuracy. The screen is also scratch-proof, and the bookstand design allows it to be tilted and adjusted from 20 to 70 degrees, or rested flat.
The connection ports are all on the side of the monitor, offering easy access. You get HDMI, DisplayPort, VGA and USB connectivity. The monitor offers low power consumption – great for both your pocket and the planet. The colours are a little dull, but overall this is an excellent buy for anyone looking for a portable touchscreen monitor.
The Philips 242B9T is another good touchscreen monitor. It might not be the most stylish looking touch monitor but it has an awful lot to offer. For a start, it comes with built-in 2W speakers. Also, you can connect it to a wide range of devices via HDMI, DVI, VGA and DisplayPort.
The Asus VT229H comes with many features you’ll find on most touchscreen monitors, including 10-point multi-touch capability, 178/178-degree viewing angles, flicker-free backlighting, and a blue light filter to make it easy on the eyes. However, it also boasts a few extras you won’t find on rival displays, and these help make your workflow more seamless.
Want a larger touchscreen monitor? Most touchscreen monitors tend to be on the smaller side, but this 27in offering from Planar offers a relatively large IPS LED-backlit display. While Full HD is a little thin for a 27in display, the screen offers dependable color accuracy and contrast that won't shift depending on where you're looking.
It's a versatile monitor too, with a built-in HD webcam and microphone, making it great for home office working and video conferencing. It boasts 10-point capacitive multi-touch and an ergonomic stand that can take the display from completely flat to a 70-degree tilt.
Is it worth buying a touchscreen monitor?
If you’ve ever used a touchscreen laptop and wished you could do the same at your desk, then the slightly higher price of a touchscreen monitor over its non-touch counterpart is well worth it. After all, there’s no other way to get that kind of nuanced control when navigating various windows and apps. For example, if you want to translate handwriting to text or draw directly on the screen using your finger, one of these panels is the way to do it. And, instead of having to use keyboard shortcuts to carry out a command, you can perform the actual action directly on the screen.
But, you won’t be holding a touchscreen display the way you would a tablet or smartphone. So, consider whether you’re comfortable using your fingers to navigate a screen that’s sitting at eye level for long periods.
What are the disadvantages of a touchscreen monitor?
There are some drawbacks to using a touchscreen monitor. For example, holding your arm up to interact with a touchscreen throughout a day’s worth of work will get tiring no matter how strong you are. And, you’ll have to clean that screen regularly. Otherwise, that buildup of smudges and fingerprints can get in the way of seeing it properly.
Most importantly, however, touchscreen displays are more likely to experience some kind of damage. This is because there’s much more going on under the hood than with non-touch monitors. And, when something goes wrong, it will cost more to fix due to the more complicated design of these panels.
What is a 10-point touchscreen?
A 10-point touchscreen can register 10 distinct points of contact all at once. Not all touchscreen devices and displays utilise this technology. But, having it makes a huge difference in the accuracy of your taps, swipes, and various gestures. With one or two-point touchscreens, any accidental contact, like the edge of your palm grazing the screen, could confuse the interface and register a tap where it shouldn’t be. Utilising a 10-point touchscreen largely eliminates that kind of accidental interaction. And, it also allows for more complex interactions like typing directly on the screen.
Can I use a touch screen monitor with any computer?
Yes, you can use a touch-sensitive screen with any PC, or even a laptop, provided it has the right ports to connect with your machine. Check what ports your device has, but most touchscreen monitors will have several, including USB and HDMI.
Finally, a user-friendly paperless device. Digital documents are right there in portrait mode for quick cross-referencing and editing is made easy with copy-paste functionality across different screens.
Our Android Displays are commercial models, specially designed for use in retail environments, as online digital signage screens, as interactive info kiosks, and so on.
Now that we've completed our first Android app, let's explore a device feature that has become particularly popular with mobile phones and tablets—multitouch. Virtually all Android devices ship today with a capacitive touch screen panel. It's a device we've gotten so accustomed to that we hardly "see" it as the hardware sensor that it is.
User interaction with Android touch screens differs somewhat from that of a mouse on a traditional computer display. First of all, we don't have one omnipresent mouse pointer for interacting with UI elements via rollovers, clicks, right-clicks, and double-clicks. In fact, we don't have a rollover or a physical "click" on the touch screen panel at all, hence UI interactions often require adjustments for the touch screen. Typically the Android device uses audiovisual cues such as click sounds or small device vibrations for user feedback.
There are a number of advantages to the multitouch screen interface to point out. First and foremost, the capacitive touch screen panel affords us more than one mouse pointer. We can work with two, five, even ten fingers on the Android, although more than three are rarely used. Multitouch allows us a variety of distinct finger gestures compared to the mouse, which we can only use to interact with the UI elements and other components displayed on the screen. The two most common multitouch gestures are the pinch and rotate gestures, typically used for scaling and rotating objects on the screen.
In this chapter, we'll get started by learning to use the mouse callback methods available in Processing for Android. Then we'll dive into the different color modes Processing has to offer, an essential topic that we need to address to work with graphics and images throughout the book. Building on the basic drawing sketch, we'll use the mouse speed to manipulate the hues of the ellipses we draw.
Finally, we'll dedicate the second part of the chapter to the multitouch features of the Android touch screen and create a sketch that showcases the most common gestures, including the tap, double-tap, long press, flick, pinch, and rotate gestures. In the sketch we'll develop, we'll manipulate the scale, position, rotation, and color of a rectangle using multitouch gestures.
To make working with multitouch gestures easy, we'll use the Ketai library for Processing, which greatly simplifies the process. We'll work with Ketai throughout the book, as it also simplifies working with sensors, cameras, location, and networking—all the hardware features that are typically difficult to work with. We'll download and install the library step by step and take a quick look at the main Ketai classes.
The capacitive touch screen panel of an Android device consists of a glass insulator coated with a transparent conductor. When we interact with the touch screen surface, our fingertips act as electrical conductors—not very good ones, but good enough to be detected. A touch on the screen surface distorts the electrostatic field, causing a change in its electric capacitance, which can be located relative to the screen surface. The horizontal and vertical position of the fingertip relative to the screen is then made available to us through the Android OS; it is updated only when we touch or move a fingertip across the screen.
The apps we write in Processing have a flexible screen orientation by default, which means our app switches orientation automatically from portrait to landscape depending on how we are holding the phone or tablet; this is detected by the accelerometer sensor we'll get to know in Display Values from the Accelerometer.
For compatibility, Processing uses the constants mouseX and mouseY when it's running in Android mode, corresponding in this case to the position of a user's fingertip relative to the upper left corner of the device touch screen rather than the position of the mouse cursor on a desktop screen. This allows us to use the same code across modes. When using mouseX in Android mode, we refer to the horizontal position of the fingertip on the touch screen panel, and when we use mouseY, we refer to the fingertip's vertical position. Both are measured relative to the coordinate system's origin in the upper left corner of the touch screen. Moving the finger to the right on the screen will increase mouseX values; moving the finger down will increase mouseY.
In Android mode, we can also use the following mouse methods, which are available in all Processing modes. The Android touch screen gestures correspond to the following mouse events:
mousePressed() This callback method is called every time a finger touches the screen panel. It corresponds to a mouse-pressed event on the desktop when the mouse button is pressed down.
mouseReleased() This callback method is called every time a finger lifts off the touch screen surface, but only if its position has changed since first touching the panel. It corresponds to a mouse-up event on the desktop.
mouseDragged() This callback method is called every time a new finger position is detected by the touch screen panel compared to the previously detected position. It corresponds to a mouse-dragged event on the desktop when the mouse moves while the button is pressed.
All three methods respond to only one finger's touch. When you use more than one finger on the multitouch surface, the finger that triggers callback events is the first one that touches the screen panel—the second, third, or more are ignored. If you hold down one finger on the screen surface, add another one, and remove the first, the second finger will now be first in line and take over mouse events. We will work with multiple fingers and multitouch gestures in just a bit in Detect Multitouch Gestures.
Create a new Android sketch by choosing File ↦ New from the Processing menu. If your new sketch window is not yet in Android mode, switch it to Android using the drop-down menu in the upper right corner. Add a few lines of code to the sketch window:
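The original listing isn't shown here; a minimal sketch along the following lines is enough, as it simply prints each mouse event together with the current fingertip position to the console:

    void setup() {
      // nothing to set up; we only want to observe the mouse callbacks
    }

    void draw() {
      // an empty draw() keeps the sketch running so the callbacks below can fire
    }

    void mousePressed() {
      println("PRESSED x:" + mouseX + " y:" + mouseY);
    }

    void mouseDragged() {
      println("DRAGGED x:" + mouseX + " y:" + mouseY);
    }

    void mouseReleased() {
      println("RELEASED x:" + mouseX + " y:" + mouseY);
    }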
With your Android device connected to your desktop via a USB cable, run the sketch on the device by pressing the "Run on Device" button in the sketch window. When the sketch is installed and launched on the device, we don't need to pay attention to the screen output of the touch screen panel, but keep an eye on the Processing console at the bottom of the sketch window.
Hold your device in one hand and get ready to touch the screen surface with the other. Take a look at the console and tap the screen. In the console, you"ll see output similar to this:
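With the minimal sketch above, a single tap produces a line like this (your coordinates will differ):

    PRESSED x:359 y:602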
Lift your finger and see what happens. If you see no additional mouse event, don't be surprised. Although we might expect a RELEASED here, we shouldn't get this event if we just tap the screen and lift the finger. The mouseX and mouseY constants always store and maintain the last mouse position. To get a mouse-released event, touch the screen, move your finger a bit, and release. Now you should see something like this:
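Again assuming the minimal sketch above, a press, a short drag, and a release produce output roughly like this:

    PRESSED x:359 y:602
    DRAGGED x:364 y:606
    DRAGGED x:371 y:611
    RELEASED x:371 y:611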
Because we touched the screen, we first trigger a mousePressed() event. By moving the finger slightly while touching the surface, we trigger mouseDragged() until we stop moving. Finally, we get a mouseReleased() event because we've updated our position since we pressed or touched the screen.
Now that we can work with the mouse callback methods, we're ready to take a look at the color support that Processing provides, which is one of its strengths. Knowing how to control color values is a fundamental skill that we'll frequently return to as we work with graphics and images throughout the book. We'll come back to the Android touch screen and its multitouch features later in this chapter.
Any geometric primitive we draw on the screen uses a particular fill() and stroke() color. If we don't say otherwise, Processing will default to a black stroke and a white fill color. We can use the fill() and stroke() methods to change default values, and we can also use grayscale, RGB, HSB, or hexadecimal color in the Android apps we create. The background() method uses color in the same way, with the exception that it cannot set a value for opacity, formally known as the alpha value.
By default, Processing draws all graphic elements in the RGB (red, green, blue) color mode. An additional alpha value can be used as a fourth parameter to control the opacity of graphic elements drawn on the screen. An alpha value of 0 is fully transparent, and a value of 255 is fully opaque. Values of 0..255 control the level of opacity for an individual pixel.
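As a quick illustration of RGB color with an alpha value (the numbers here are arbitrary), the following snippet draws a half-transparent red ellipse on a gray background:

    background(127);                       // medium gray background; no alpha value possible here
    stroke(0);                             // black outline
    fill(255, 0, 0, 127);                  // red fill at roughly 50 percent opacity
    ellipse(width/2, height/2, 200, 200);  // ellipse centered on the screen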
Now that we've learned about the different color modes, methods, and types available to define colors in an Android app, let's refine our previous drawing sketch.
Now let's explore the HSB mode on the device touch screen display. By adding colorMode() to our sketch, we switch the color mode, and by modifying our fill() method to work with HSB values, we change our app from grayscale to shades of color. Here's the result:
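The exact listing isn't reproduced here, but a sketch along these lines matches the description: it maps the finger's speed, measured as the distance between the current and previous positions, to the hue of each ellipse (the range values passed to colorMode() are an assumption):

    void setup() {
      noStroke();                      // draw the ellipses without outlines
      background(0);                   // start with a black screen
      colorMode(HSB, 360, 100, 100);   // switch to HSB: hue 0..360, saturation and brightness 0..100
    }

    void draw() {
      // finger speed: distance between the current and the previous touch position
      float speed = dist(mouseX, mouseY, pmouseX, pmouseY);
      fill(speed, 100, 100);           // map the speed to the hue, at full saturation and brightness
      ellipse(mouseX, mouseY, speed, speed);
    }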
The hue of the ellipses changes depending on how fast you move your finger across the touch screen surface. Slow movements result in greenish, medium in blueish, and fast movements in reddish values.
In this project, we'll keep the screen orientation() flexible, which is the default setting for our Processing apps, and we don't have to set anything to make the app change orientation when we hold the device upright or sideways. This means it will change orientation when the built-in device accelerometer sensor decides that the app should adapt to the particular orientation at that moment. When such an orientation change occurs, our setup() method will be called again, reinitializing the sketch and executing all the statements we've included in setup(). Because we set the screen to black, erasing its contents using the background(0) method, a change in the app's orientation will reset the background() to black, erasing all the ellipses we've drawn prior to changing the orientation.
Rerun the sketch on the device now switched to HSB color, and see how the hue of the ellipses that are drawn changes depending on how fast you move your finger across the screen surface. The changes in the hues occur independently of their saturation and brightness, so all the colors are drawn with maximum saturation and brightness.
If you go back to the HSB color wheel we looked at earlier, you will see how the 360 degrees of the HSB color wheel correspond to the differently colored ellipses you draw on the screen. Slow movement results in yellow color values; then, with increasing speed, you'll see green, cyan, and blue, with magenta at the maximum speed.
Now that we've mastered the use of color in Processing, let's continue our investigation into the multitouch screen panel. We'll go ahead and install a Processing library that will help us work with multitouch gestures and extend the core features Processing provides us. Besides multitouch, the Ketai library makes it easy for us to work with other hardware devices and sensors built into Android phones and tablets. Let's take a look at the Ketai classes and the features they provide. We'll be using these throughout the rest of the book.
The Ketai library for Processing focuses particularly on making it easy to work with the mobile hardware features built into Android phones and tablets. The term "Ketai" is used in Japan to describe its cell phone culture, enabled by mobile handheld devices. The Japanese term for the mobile phone, keitai denwa, literally means "something carried in the hand," or handheld. The Ketai library is free software published under the GNU General Public License Version 3 (GPL v3), and it is compatible with Android versions 2.3 Gingerbread, 3.0/3.1 Honeycomb, 4.0 Ice Cream Sandwich, 4.1/4.2/4.3 Jelly Bean, 4.4 KitKat, and 5.0 Lollipop. NFC, Wi-Fi Direct, and the updated camera features introduced in 4.0 Ice Cream Sandwich are not available in Gingerbread or Honeycomb. Therefore the Ketai library is available as separate downloads for Gingerbread/Honeycomb and for Ice Cream Sandwich/Jelly Bean. Please refer to Run a Sketch on an Android Device to find out which version of Android you are running on your device.
Libraries are arguably one of the most successful aspects of the open source Processing project. There are more than 130 libraries available for Processing; however, on the Android device we can only use those Java libraries that do not make use of desktop hardware. Libraries extend the easy-to-learn Processing core with classes written for particular contexts, including 3D, animation, compilations, computer vision, data and protocols, geometry, graphic interface, hardware interface, import and export, math, simulation, sound, tools, typography, and video—to name the main categories listed on the Processing website, where the libraries are organized.
Many interface, sound, computer vision, and import/export libraries use code that is specific to the desktop context and are not designed for use on Android devices. Many libraries that could be compatible with Processing for Android are currently being updated by their authors and will eventually be available for us to use in Android mode. The Library Manager added to Processing 2.0 makes it easy to install libraries from within Processing. We'll use it to install the Ketai library in Install the Ketai Library, below.
While there are scores of Processing libraries, only a small number of them work on Android phones and tablets. The Ketai library is designed particularly to provide programmer access to Android sensors, cameras, and networking; it is the only library that has been developed to run solely in Android mode.
I've been working on the Ketai library with Jesus Duran since 2010, with the objective of making it really easy to write apps that can effectively use the mobile hardware features built into Android phones and tablets. Convinced by the idea that phones and tablets evolve rapidly alongside the open source Android OS, the Ketai library makes it possible to consider such devices as a great complement to microcontrollers such as the Arduino—an open hardware sister project to Processing that is built on the same IDE.
Besides their compact form factor, multicore Android phones and tablets are computationally quite powerful, are equipped with a wide range of sensors, and run an operating system that is open source, free, and doesn't require subscriptions—characteristics that are advantageous to innovation, academic use, and DIY culture. What's more, once a mobile phone or tablet is outdated, it remains an inexpensive device, available in abundance and way too functional for a landfill.
The Ketai library values conciseness and legibility in its syntax and makes hardware features available using just a few lines of code. For example, the simple code we use for our accelerometer project (code available here) uses less than thirty lines of code altogether, while the Java sample included in the Android SDK completes the task with more than one hundred lines of code. This ratio increases significantly with more complex subjects such as Chapter 7, Peer-to-Peer Networking Using Bluetooth and Wi-Fi Direct, and Chapter 8, Using Near Field Communication (NFC), where Ketai is significantly more concise and easier to understand than the SDK.
Ketai includes a number of classes that make Android hardware sensors and devices available within Processing. The following classes are included in the library, described in more detail in ketai classes, and explained within the relevant chapters:
KetaiGesture gives us access to the most common multitouch gestures used on mobile devices. It provides us with the callback methods that we need to highlight, scale, drag, and rotate objects and UI elements. To select, zoom, focus, and organize the elements we display on the touch screen, we can use a number of gestures that have become user interaction standards on mobile devices. Working off established UI standards, we can build apps that are more intuitive to use and that enable the user to get the job done quickly while on the move.
Using the KetaiGesture class, we can work with the following callback methods, which report back to us when a certain event, triggered by a particular user interaction or multitouch gesture, has occurred on the touch screen surface.
onTap(float x, float y)Single Tap—triggered by one short tap on the device screen. Returns the horizontal and vertical position of the single-tap gesture.
onLongPress(float x, float y)Long Press—triggered by tapping and holding a finger at one position on the touch screen for about one second. Returns the horizontal and vertical position of the long press gesture.
onFlick(float x, float y, float px, float py, float v)Flick—triggered by moving a finger in any direction, where the beginning and the end of the gesture occur at two different screen positions while the finger doesn't come to a full stop before lifting it from the screen surface. Returns the horizontal and vertical position where the flick is released, the horizontal and vertical position where the flick started, and the velocity of the flick.
onRotate(float x, float y, float angle)Rotate—triggered by the relative change of the axis rotation defined by two fingers on the touch screen surface. Returns the centroid of the rotation gesture and the relative change of the axis angle.
For this project, we'll implement the most common user interactions using just one simple geometric primitive—a rectangle—drawn on the screen using Processing's rect(x, y, width, height) method. To begin, we'll place a rectangle of a specified size of 100 pixels in the center of the screen. Then we use a series of KetaiGesture callback events to trigger changes to the rectangle, including a change of scale, rotation, color, and position, as illustrated in Figure 2.2.
We have a number of callback events for the touch surface to try out, so we'll assign each of them a particular purpose. We'll zoom to fit the rectangle onto the screen using onDoubleTap(), randomly change its fill color onLongPress() using Processing's random() method, scale it onPinch(), rotate it onRotate(), drag it using mouseDragged(), and change the background color onFlick(). Besides manipulating color properties and the rectangle, we'll keep track of the multitouch events as they occur by printing a text string to the Processing console. The code we use to manipulate the properties and the callback methods themselves are not complicated in any way, but we're now dealing with a bit more code than we have before because we're using a series of callback methods in one sketch.
For this project, we'll lock our app into LANDSCAPE orientation() so we can maintain a clear reference point as we discuss 2D transformations in reference to the coordinate system. To center our rectangle on the screen when we start up, to scale from its center point using the pinch gesture, and to rotate it around its center point using the rotate gesture, we need to work with two-dimensional (2D) transformations.
A common metaphor to explain 2D transformations is a grid or graph paper. Using this analogy, each grid cell stands for one pixel of our app's display window. The default origin in Processing's coordinate system is always the upper left corner of the window. Each graphic element is drawn relative to this origin onto the screen. To move and rotate our rectangle, we'll use Processing's transformation methods: translate() and rotate(). We also have a scale() method, which we won't use in this sketch.
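To make the metaphor concrete, here is a small sketch fragment (not the project code itself) that uses translate() and rotate() to draw a 100-pixel square rotated about the center of the screen:

    rectMode(CENTER);               // interpret the rect() x and y parameters as the center point
    translate(width/2, height/2);   // move the origin from the upper left corner to the screen center
    rotate(QUARTER_PI);             // rotate the coordinate system 45 degrees around the new origin
    rect(0, 0, 100, 100);           // the square is drawn at the translated, rotated origin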
Like our first project in this chapter, in which we used Processing's mousePressed(), mouseReleased(), and mouseDragged() callback methods to identify touches to the screen, some of the multitouch gestures introduced here fulfill the same purpose. If we'd like to use Processing's mouse methods alongside the multitouch methods provided by KetaiGesture, we'll need to call the superclass method surfaceTouchEvent() to notify the Processing app that a surface touch event has occurred.
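In a sketch that does both, the forwarding typically looks like the following fragment; the instance name gesture is an assumption, created in setup() with new KetaiGesture(this):

    import ketai.ui.*;                // KetaiGesture lives in Ketai's UI package
    import android.view.MotionEvent;  // Android event type passed to surfaceTouchEvent()

    KetaiGesture gesture;             // assumed instance, created in setup() with new KetaiGesture(this)

    public boolean surfaceTouchEvent(MotionEvent event) {
      super.surfaceTouchEvent(event);           // keep mouseX, mouseY, and the mouse callbacks working
      return gesture.surfaceTouchEvent(event);  // hand the same event to KetaiGesture as well
    }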
Use Processing's mouseDragged() callback to update the rectangle position (x and y) by the number of pixels moved. Determine this amount by subtracting the previous pmouseX from the current mouseX, and pmouseY from mouseY. Move the rectangle only if the absolute distance between the rectangle and the mouse position is less than half the rectangle's size, in other words, only when we are touching the rectangle.
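A sketch of that callback, assuming global floats x and y for the rectangle's center and rectSize for its side length (hypothetical names), could look like this:

    void mouseDragged() {
      // move the rectangle only while the fingertip is within half its size of its center
      if (abs(mouseX - x) < rectSize/2 && abs(mouseY - y) < rectSize/2) {
        x += mouseX - pmouseX;   // horizontal distance moved since the last frame
        y += mouseY - pmouseY;   // vertical distance moved since the last frame
      }
    }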
Run the app on your device. You'll see a square show up in the center of the screen. Drag it to a new location, flick to change the background color, and give it a long tap to change the foreground fill color.
To test the multitouch gestures, put two fingers down on the screen and pinch, and you'll see how the rectangle starts scaling. Now rotate the same two fingers to see the rectangle rotate. If you use more than two fingers, the first two fingers you put down on the screen are in charge.
You've used the touch screen panel as the first hardware device we've worked with. You've learned about mouse speed and all the different color features in Processing and worked with the HSB color mode to manipulate the hue values of the geometric primitive we've drawn. And finally, you are now able to use mouse events and multitouch gestures for your app's user interfaces to control the object you display on the device screen.
You are now well positioned to move on to the next chapter, where we'll focus on the hardware sensors built into Android devices. After all, the "native" user experience of mobile apps relies heavily on hardware devices and sensors, and we are now ready to incorporate them into our apps.
If you travel with a laptop and iPad, you need this app. I needed a second screen, but Duet gives me even more. Full gesture support, customizable shortcuts, Touch Bar, tons of resolution options, and very little battery power. How is this all in one app?
I just love this app. Especially when I am travelling for work and working from the company branches. Then I use my iPad as a second monitor for Outlook, Lync and other chat while I use the laptop's big screen for remote desktop to my workstation at the main office. :)
Because of this, touchscreen displays will play an important role in allowing digital collaboration between hybrid teams to continue in a seamless way. Up until a few years ago, touchscreens were clunky devices that were too complex and difficult to use. But improvements in the accuracy of the touch interface have enhanced the user experience greatly. You can now use them with the familiarity of your personal touch devices.
There are a number of great reasons for using digital devices in your sessions, but choosing which touchscreen display to get is not that straightforward. The touchscreen display market is becoming increasingly crowded, with many different brands, screen technologies, sizes, and extras to choose from. More importantly, the space, software, methods, and people involved also need to be taken into account before such an investment is made.
For a smooth MURAL experience we recommend using a good graphics card and at least 8GB of RAM (but preferably 16GB of RAM). This will make sure the touch experience is smooth.
Try quickly drawing and moving sticky notes in MURAL to see how much latency (delay) the screen has. The less latency the better; we’ve seen screens go unused because of their high latency.
By asking yourself each of the questions below, you can make sure that you are making an investment in a touchscreen display that will allow your hybrid, remote, and in-person teams to collaborate effectively.
Start with the question: what is the main purpose and use case for this touchscreen display? Too often we have seen touchscreens collecting dust because the purpose for using them wasn’t clear from the beginning.
Work here is often done in break-out groups of five to eight people where they will work on one screen to solve a problem using a variety of techniques. This set-up is very flexible and groups are often moved around different areas of the space. Touch experience, multi-touch (allows multiple people to work on the screen at the same time) and movability of the device are very important here. The screen size can also be a bit smaller. For instance, you can even do breakout group work on a 27” touchscreen monitor set up on a hightop table.
The only person interacting with the screen will be a presenter or facilitator, to display their presentation, synthesize work and make annotations. The larger the screen the better here. Sometimes it is even connected to a large projector for bigger groups. Multi-touch experience is not crucial. And the screen might be in a fixed position.
Touchscreen displays typically come in diagonal screen sizes of 55”, 65”, 75” and 85”. But 27” touchscreen monitors such as the Microsoft Surface Studio could also work well.
If we bring this back to the use cases mentioned earlier, break-out sessions in innovation labs are often best served by a 55-65” screen, but the 27” Microsoft Surface Studio on top of a high table works just as well. For presentation rooms you would almost always need to go for the largest screen size, or even connect it to a projector with a larger screen size. And meeting rooms could best use a 55-75” screen, depending on the size of the room.
We’ve seen large touchscreen displays mounted to a wall in a meeting room where it was just being used for screen sharing like a normal TV screen. You want to avoid that.
The advantage of having the screen on a rolling stand is that its users will be able to work in the best places to support whatever task is at hand. It allows people to be more agile and creative, which is essential for teams in innovation labs. But it can also provide more flexibility to meeting and presentation rooms, as you are likely not going to equip every single room with a touchscreen display.
The reasons touchscreen displays get fixed to a wall are saving space and aesthetics. IT might also not want expensive equipment moving around. And the extra cost of a rolling stand might weigh in sometimes too.
Besides a big screen on a rolling stand, a smart projector like Adok is an interesting solution for providing a touch ‘screen’ in a portable format that can be set up anywhere.
Built-in PCs can run Windows 10, but regularly run an OS (operating system) specific to the manufacturer (based on Android). These PCs are often very entry-level and just powerful enough for basic software.
OPS PCs can be shipped with any operating system a PC can have and are usually the norm for corporate clients, for example to run the Windows 10 that is standard across their other devices. These PCs are also added to provide more power, in order to run more advanced software and offer a better touch experience. Just like built-in PCs, they can be set up in either PC/desktop or kiosk mode.
This is a new concept for touchscreen displays that is gaining popularity recently. Here the display is basically just an empty screen without any operating system. You just connect your own device to make use of the screen. It’s plug & play to project your device’s display on the big screen and use its touchscreen to control. This has two advantages. No worries for IT because there is nothing to secure on the touchscreen display. And ease of use for the user because they know very well how to operate their personal device.
The Microsoft Windows Collaboration Displays developed by Avocor and Sharp are great examples of this. But you can essentially connect your personal device to any touchscreen display. You can for example connect your device running Windows 10 to a Samsung Flip (that has an Android based OS) to run the apps you need.
Some touchscreen displays come with webcam and microphone integrated. Others are easily upgradeable with a camera plus microphone such as the Huddly. Both of these options work great when collaborating with a smaller team close to the screen.