Bongofish
November 19, 2017, 03:12:15 PM *
Author Topic: USB to VGA adapters, a possible palliative for DVI-induced Jitters.  (Read 4265 times)
DaBotz
Sr. Member - Posts: 227
« on: May 30, 2016, 07:30:18 PM »

I was re-reading the last message of Marcelo Aguiar about his build, the Cintia.

He had managed to get it usable, only to have to discard it because his new laptop had only a DVI output, and over DVI the Cintia was unusable.

Now, this is not really a surprise - DVI carries digital signals, which means they are not required to have a "nice" waveform and can pick up as much noise as they want, as long as the logic levels are respected.

Also, we should remember that the "ideal" digital signal - a square wave - has an infinite spectrum (composed of "spikes" at the harmonic frequencies of the signal's clock, in the envelope of the Fourier transform of a Dirac delta... agh, whatever), which is the reason it cannot exist in reality.

So, that DVI can stand in the way of using a DIY Cintiq... it is almost natural.

At the time it saddened me a little, but I had nothing more to offer.

Anyway, since then I have come across USB-to-VGA adapters, and I think these may help in such cases.

I have two DisplayLink USB 2.0 to VGA adapters, and I use one of them as the VGA source for one of my builds.  


USB 3.0 adapters should work similarly; however, there are many cheap adapters of that kind (starting from some $15) and, in my experience, they are unusable if you do not have a USB 3.0 port. I do not have any USB 3.0 ports - my PC is that old.

Even though they advertise USB 2.0 capability, they simply do not have enough hardware to cope with the lower bandwidth allowed by USB 2.0 (which is the reason why the USB 2.0-only adapters tend to cost quite a bit more... essentially, they are a compact video card reconstructing a VGA signal from an MPEG - or some lossless equivalent - stream made almost exclusively of B (difference) frames, whereas the 3.0 ones can make do with a lot more I (intra) frames and need less intelligence in the adapter).
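To put rough numbers on that bandwidth argument, here is a back-of-envelope sketch; the resolution, colour depth and nominal line rates are assumptions for illustration, not measurements of any particular adapter:

```python
# Back-of-envelope: why a USB-to-VGA adapter must compress on USB 2.0.
# Assumed figures: 1440x900, 24-bit colour, 60 Hz, nominal USB line rates.
WIDTH, HEIGHT, BYTES_PER_PIXEL, FPS = 1440, 900, 3, 60

raw_bps = WIDTH * HEIGHT * BYTES_PER_PIXEL * 8 * FPS  # uncompressed bits/s
usb2_bps = 480_000_000    # USB 2.0 high-speed line rate (payload is lower)
usb3_bps = 5_000_000_000  # USB 3.0 SuperSpeed line rate

print(f"Uncompressed stream: {raw_bps / 1e6:.0f} Mbit/s")   # 1866 Mbit/s
print(f"USB 2.0 shortfall:   {raw_bps / usb2_bps:.1f}x over the line rate")
print(f"USB 3.0 headroom:    {usb3_bps / raw_bps:.1f}x the stream")
```

So a USB 2.0 adapter has to squeeze the stream by roughly 4:1 even before protocol overhead, while USB 3.0 could in principle carry it raw - which matches the point about the 2.0-only adapters needing the beefier encoder.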

Anyway, enough digressing and back to the chop...

As I said, I have a couple of these buggers - a lucky acquisition on eBay, used but hardly ever, judging by the condition of the cases - and I have taken a bit of a look at how they perform.

No, they are not a good idea for watching movies.

It can be done, but...

Graphics board -> image -> encoding -> send the images over USB -> reconstruct the raster from the stream -> read out the raster as VGA

It will add some latency (negligible, in most cases... and I am using computers 8 and 12 years old) and it will dig into your CPU, as most - if not all - of this is done in software, on the PC side.


(Again, it digs a lot for a movie watched at 1440x900 on my 8-year-old AMD dual-core... it would probably be a far less troubling share of resources on a more capable machine)



Movies can still be watched, but I'd say gaming through these things is definitely a bad idea (the more of an image changes, the more data has to be encoded and shipped, and the more latency is added - and some of the latency is simply due to the data having to go through the USB at its maximum rate...).
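As a rough illustration of the bus-transfer share of that latency (the frame sizes and the usable throughput below are assumed numbers, not measurements of any real adapter):

```python
# Assumed: a compressed frame of ~200 KiB and ~30 MiB/s of usable
# USB 2.0 payload throughput (the 480 Mbit/s line rate minus overhead).
frame_kib = 200
usb2_payload_mib_s = 30

transfer_ms = frame_kib / 1024 / usb2_payload_mib_s * 1000
print(f"Bus transfer alone: {transfer_ms:.1f} ms per frame")  # ~6.5 ms

# A full-screen change (movie, game) means a much bigger frame to ship:
big_frame_kib = 1500  # assumed worst case
big_ms = big_frame_kib / 1024 / usb2_payload_mib_s * 1000
print(f"Worst case: {big_ms:.1f} ms per frame")
```

Even with made-up but plausible figures, the gap between "a few small changes" and "the whole screen changed" is an order of magnitude - which is why gaming suffers and drawing mostly does not.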

But drawing? Drawing is a completely different matter.

Most of the time it is about small changes (in my case, single lines - I am not a lover of colours, to be honest).

Losing a tenth of a second opening a menu is not going to feel particularly sluggish (more so if you got used to Lazy Nezumi at, say, 10 to 20 steps of moving average), and chances are you'll notice even less than that.
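For the curious, the moving-average smoothing mentioned above can be sketched like this (an illustrative toy, not Lazy Nezumi's actual algorithm - the function name and the sample stroke are made up):

```python
# Toy sketch: smoothing jittery pen coordinates with an N-step moving
# average. Each output point is the mean of the last `steps` raw points,
# which trades a little lag for a steadier line.
from collections import deque

def moving_average_filter(points, steps=10):
    window = deque(maxlen=steps)  # holds the most recent `steps` samples
    smoothed = []
    for x, y in points:
        window.append((x, y))
        n = len(window)
        smoothed.append((sum(p[0] for p in window) / n,
                         sum(p[1] for p in window) / n))
    return smoothed

# A jittery horizontal stroke: y wobbles +/-2 around 100.
raw = [(i, 100 + (-2 if i % 2 else 2)) for i in range(20)]
print(moving_average_filter(raw, steps=4)[-1])  # -> (17.5, 100.0)
```

Note the trade-off: the smoothed point trails the raw pen position by about half the window length, which is exactly the "lag" a heavy smoothing setting makes you used to.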

So, if your build is well behaved over VGA and goes bonkers with DVI/HDMI, my suggestion is - before scrapping it - to try one of these adapters...


Note: there are some USB 3.0 adapters that also expose a USB 2.0 ancillary input...

Marcelo's build had all the room needed to put one such adapter inside it and hook the Intuos up to the USB 2.0 port, possibly using a 12 V-to-5 V converter to feed some more juice into the system; his build would then have needed only the USB and power cables... which would not have been bad either.


 



« Last Edit: May 30, 2016, 08:18:29 PM by DaBotz »

The most incredible artist of... Barbanza?
Ertew
Full Member - Posts: 103
« Reply #1 on: May 30, 2016, 09:40:24 PM »

[...]
Now, this is not really a surprise - DVI are digital signals, which means that they are not required to have a "nice" waveform, and can pick up as much noise they want, as long as the logic levels are respected.
[...]
I must disagree with You.

First topic:
  • HDMI is digital-only (except that some devices can push an analog signal through the HDMI connector).
  • VGA is analog-only (except for the i2c bus for monitor ID and some control).

What about DVI? Look here: https://pl.wikipedia.org/wiki/Plik:DVI_Connector_Types.svg
  • DVI-A is analog-only, same as VGA
  • DVI-D is digital-only; the single-link version is electrically equivalent to HDMI (except HDMI sucks at encryption)
  • DVI-I, which can be found on most video cards, has both the analog signals and a single digital link, sometimes a dual link too
So please check a simple DVI -> VGA adapter, such as the ones many graphics card manufacturers include in the box.

BTW, second topic:
I appreciate all the info about USB -> VGA adapters and the idea of making a Cintiq with only USB + a power supply (a USB hub should be included), but did You try HDMI -> VGA converters? The alternative configuration, DVI -> HDMI via a simple adapter and then HDMI -> VGA via a converter, should work too.

Any errors in spelling, tact or fact are transmission errors.

Hi! I'm a signature virus. Copy me into your signature to help me spread.
Pesho
Hero Member - Posts: 264
« Reply #2 on: May 31, 2016, 11:30:44 AM »

I avoid using VGA as much as I can, because the image quality is noticeably worse compared to DVI and HDMI. Besides, in the end you always have a digital signal at the LVDS or eDP line, so why convert back and forth between analog and digital? It makes more sense on a CRT, since that is more or less an analog system.

I've read about DisplayLink before; Wacom's DTU-1031 tablet uses it. It's compact but doesn't really offer a significant benefit. Getting the OS to use the special software is likely to be problematic, and you don't really save that much space either. There's also the lag you mentioned... It's easier to just route HDMI, USB and power into one physical cable anyway, the way it's done on the real Cintiqs. I'd rather opt for something like mini or even micro HDMI to save space instead of USB.
« Last Edit: May 31, 2016, 11:33:51 AM by Pesho »
DaBotz
Sr. Member - Posts: 227
« Reply #3 on: May 31, 2016, 12:07:38 PM »

DVI-A is virtually VGA, true...

So the converter is just a passive system (a set of resistors?)

DVI-D and HDMI are, electrically, the same, again.

So the HDMI-DVI converters are little (if anything) more than pin adapters between the two formats.

Now, one small problem is that the DVI output of my PC's graphics card, in analog mode... simply clones the VGA signal.

If I use a "passive adapter"... it just gives me a copy of the main screen. Which could be fine for some applications, a bit of a pain for others.

To have an extended desktop with two screens, I have to use a monitor with a proper digital DVI input (and mine is not the only PC with that "habit" around me).

Also, Marcelo had to switch to DVI when he bought a new laptop.

I suspect that his new laptop had an HDMI output (DVI takes up even more edge space than a VGA socket, and if a "sleek design" was the reason they omitted the VGA, they hardly would have fitted a DVI), which he attached to the DVI input of his build through one of those simple adapters.

HDMI-VGA adapters seem a good idea - a lot less latency (all the work is done by the adapter, and should take milliseconds).

Anyway, I also like to have three screens at times... (hubris is my middle name) and my PC has no more than two video outputs.

On the other hand, while it is true that going from digital to analog and back to digital is a bit of an... idiotic course of action, the fact is that some have reported that their builds become more jittery with digital signals.

It is not a question of finesse, elegance or keeping things compact.

Personally, I do not care about elegance or keeping things slim as much as getting the build to be usable.

If a build is usable with VGA and not with HDMI or DVI, I'd say: to hell with image quality.

And unifying the USB and screen signals appeals to me mostly as a reduction of the entry points for external noise... provided, of course, that the adapter itself does not turn out to be a source of jitter.

Alas, I am not going to test things for a while... with 3 usable builds, I am already over-Cintiqized.

I admit a fleeting wish to rebuild the Cabinetiq with a better screen... but I shall refrain from it for the moment.

[Edited to fix some orthography and grammar screw-ups]
« Last Edit: May 31, 2016, 01:22:00 PM by DaBotz »
DaBotz
Sr. Member - Posts: 227
« Reply #4 on: May 31, 2016, 01:29:48 PM »

Video of a test draw with the MiniQ - https://vimeo.com/168676323
Ertew
Full Member - Posts: 103
« Reply #5 on: June 06, 2016, 07:30:51 PM »

So the converter is just some passive system ( a set of resistors?)
Almost true. I have tested several of these adapters and found no passive components inside. Just two connectors, a lot of wires and some glue.


DVI-D and HDMI are, electrically, the same, again.

So the HDMI-DVI converters are little (anything?) more than pin switchers between the two formats.
True again. Just metal, plastic and glue.


Now, one small problem is that the DVI exit of my PC graphic card, in analog mode... simply clones the VGA signal.

If I use a "passive adapter"... it just gives me a copy of the main screen. Which could be fine for some applications, a bit of a pain for others.

To have an extended desktop with two screens, I have to use a monitor with a proper, digital DVI input (and mine is not the only PC with that "habit", around me). 
It's a limitation of the GPU chipset, not of the DVI connector. Your graphics card has a single analog output that is shared between the VGA and DVI connectors.


Anyway, I also like to have three screens, at times... (hubris is my middle name)  and my pc have no more than two video outputs.
This is also a typical problem with budget graphics cards. Many of them have 3 or 4 outputs and can switch the signal to any of them, but only two can be used simultaneously. Same for the Intel integrated GPU on my machine :(


On another side, while it is true that going back and forth from digital to analog to digital is a bit of a... idiotic course of action, the fact is that some have referred that with digital signals their builds become more jittery.
Digital -> analog (VGA) -> digital (LVDS) -> analog (the signals inside the LCD glass) is still better than the USB things.
For me, USB has only two advantages:
- A universal plug that can go into any USB socket, e.g. on the front panel. A regular video plug needs a dedicated socket, often hidden deep on the rear side of the PC.
- The converter acts as a graphics card of its own, so you can get around the maximum number of monitors your graphics card supports.

DaBotz
Sr. Member - Posts: 227
« Reply #6 on: June 06, 2016, 08:16:54 PM »

To be honest, most of the connectors that you find on the front of PCs are USB 1.0 (or 1.1) only - often they are really just small hubs - and, as such, are not really able to handle that kind of adapter either.

(And some can't even push out a proper 5 volts... which is the reason why I added a powered hub to the UBiQ, although it has the odd side effect that, if I disconnect the UBiQ's USB out and try to connect it to another PC, I also have to reset the hub by switching the screen off with the UBiQ's master switch, or it may lock up.)





 
