Anyone ever done "hello world" through a modern GPU?

Programming, for all ages and all languages.
LieutenantHacker
Member
Posts: 69
Joined: Sat May 04, 2013 2:24 pm
Location: Canada

Re: Anyone ever done "hello world" through a modern GPU?

Post by LieutenantHacker »

Well, sorry for not checking that, but it still doesn't clear up my confusion about what actually limits the resolution the controller can output, and whether that limit can be exceeded.

For example, if resolution isn't dependent on the GPU, why do people say you NEED the GPU to achieve higher resolution? A few things still haven't been cleared up.

The term GPU can also be used in different contexts. Does N64's RCP count as a GPU? Do people tend to only use "GPU" when referring to 3-D output?

I want to know the real details here, no abstractions, as this still baffles me after years:

The GPU doesn't output graphics ... so the "framebuffer" hardware does the output and is therefore the limit on the output ... then the GPU is not needed for HD output ... since the GPU does not drive a video signal.
The desire to hack, with the ethics to code.
I'm gonna build an 8-bit computer soon, with this as reference: http://www.instructables.com/id/How-to- ... -Computer/
Octocontrabass
Member
Posts: 5587
Joined: Mon Mar 25, 2013 7:01 pm

Re: Anyone ever done "hello world" through a modern GPU?

Post by Octocontrabass »

LieutenantHacker wrote:Does N64's RCP count as a GPU?
You're thinking of two components of the RCP. One, the RSP, is a 32-bit MIPS R4000 CPU with 8192 bytes of RAM and some SIMD instructions bolted on. The other, the RDP, is a rasterizer. Most programmers used them in tandem as a GPU, although they are entirely independent of one another.

The RCP as a whole is a complete chipset - it includes a memory controller, an expansion bus controller, a video generator, an audio generator, and of course the aforementioned RSP and RDP.
Rew
Member
Posts: 28
Joined: Mon Oct 29, 2012 2:26 pm

Re: Anyone ever done "hello world" through a modern GPU?

Post by Rew »

The GPU is not "needed" for any graphics. There is hardware that takes a chunk of memory and transmits it to the monitor. At that point it is just pixel data. All you need is something to translate the pixel data and something to alter the pixel data. A CPU and a GPU are both able to perform calculations and modify the "framebuffer".

The challenge is that the CPU tends not to be fast enough to do the kinds of calculations that graphics require in a reasonable time. Additionally, many graphics calculations can be done in parallel. Also, the constant writing to the framebuffer benefits from a different set of optimizations than ordinary CPU memory access. So, rather than using every available clock cycle to force your CPU to do something it is not good at, we came up with a processor that is really good at the types of calculations and memory access patterns that graphics programming requires. All the GPU does is make quicker what could also be done on the CPU.
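To make that concrete, here is a minimal sketch of the CPU doing "graphics" with nothing but ordinary memory writes. The framebuffer address, resolution and pitch below are made-up values; in practice you get them from the firmware (VBE/GOP) or from whatever code set the mode.

Code: Select all

#include <stdint.h>

/* Made-up values; in reality these come from VBE/GOP or your mode-setting code. */
static volatile uint32_t *framebuffer = (volatile uint32_t *)0xE0000000;
static const uint32_t fb_width  = 1920;
static const uint32_t fb_height = 1080;
static const uint32_t fb_pitch  = 1920 * 4;   /* bytes per scanline */

static void put_pixel(uint32_t x, uint32_t y, uint32_t argb)
{
    /* The framebuffer is just memory: offset = y * pitch + x * bytes_per_pixel. */
    framebuffer[(y * fb_pitch) / 4 + x] = argb;
}

static void fill_rect(uint32_t x0, uint32_t y0, uint32_t w, uint32_t h, uint32_t argb)
{
    for (uint32_t y = y0; y < y0 + h && y < fb_height; y++)
        for (uint32_t x = x0; x < x0 + w && x < fb_width; x++)
            put_pixel(x, y, argb);   /* slow, but entirely GPU-free */
}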

LieutenantHacker wrote:Well, sorry for not checking that, but it still doesn't clear up my confusion about what actually limits the resolution the controller can output, and whether that limit can be exceeded.

For example, if resolution isn't dependent on the GPU, why do people say you NEED the GPU to achieve higher resolution? A few things still haven't been cleared up.

The term GPU can also be used in different contexts. Does N64's RCP count as a GPU? Do people tend to only use "GPU" when referring to 3-D output?

I want to know the real details here, no abstractions, as this still baffles me after years:

The GPU doesn't output graphics ... so the "framebuffer" hardware does the output and is therefore the limit on the output ... then the GPU is not needed for HD output ... since the GPU does not drive a video signal.
Combuster
Member
Posts: 9301
Joined: Wed Oct 18, 2006 3:45 am
Libera.chat IRC: [com]buster
Location: On the balcony, where I can actually keep 1½m distance

Re: Anyone ever done "hello world" through a modern GPU?

Post by Combuster »

The problem is that many people wrongly substitute "GPU" when they mean the entirety of the video card.
"Certainly avoid yourself. He is a newbie and might not realize it. You'll hate his code deeply a few years down the road." - Sortie
[ My OS ] [ VDisk/SFS ]
LieutenantHacker
Member
Posts: 69
Joined: Sat May 04, 2013 2:24 pm
Location: Canada

Re: Anyone ever done "hello world" through a modern GPU?

Post by LieutenantHacker »

A video card and a GPU can be used synonymously, and many people do use the terms that way.

But if the GPU just "rapidly manipulates data for the framebuffer", how can I find out the maximum video output resolution then? I never bothered to check, because people tell me all the time that you "need a GPU for HD graphics", i.e. 1920x1080.

But if you are correct in saying that the GPU does not determine maximum resolution and the framebuffer does, where can I find info about my framebuffer then?
The desire to hack, with the ethics to code.
I'm gonna build an 8-bit computer soon, with this as reference: http://www.instructables.com/id/How-to- ... -Computer/
sortie
Member
Posts: 931
Joined: Wed Mar 21, 2012 3:01 pm
Libera.chat IRC: sortie

Re: Anyone ever done "hello world" through a modern GPU?

Post by sortie »

Then perhaps you should stop taking the advice of such people over that of operating systems developers? :-)

Really, the reason why you need a fast enough video card for `HD graphics' is partially that it needs to be fast enough to send the data to the screen, but mostly that 3D games and other applications take more effort to render at higher resolutions. However, the CRTC is what matters when dealing with video output and framebuffers. If the actual GPU is not used, then you can simply think of the video card as a bunch of video memory and a CRTC. You then tell the CRTC to transmit video data from a particular area in video memory, this being the framebuffer. Naturally, to set up the screen and learn what resolutions are available, you'll want to talk to the CRTC as well. (Not entirely sure about the details.)
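As a concrete legacy illustration of "telling the CRTC where to scan out from", here is a minimal sketch that reprograms the standard VGA CRTC's start-address register pair through the classic 0x3D4/0x3D5 index/data ports. This only covers the start address; a full mode set touches many more CRTC, sequencer and DAC registers, and modern cards expose their display controllers through their own MMIO registers instead.

Code: Select all

#include <stdint.h>

/* Port-write helper (GCC/Clang inline asm, x86). */
static inline void outb(uint16_t port, uint8_t val)
{
    __asm__ volatile ("outb %0, %1" : : "a"(val), "Nd"(port));
}

/* Tell the VGA CRTC where in video memory to start scanning out from. */
static void vga_set_start_address(uint16_t start)
{
    outb(0x3D4, 0x0C);             /* CRTC index: Start Address High */
    outb(0x3D5, start >> 8);
    outb(0x3D4, 0x0D);             /* CRTC index: Start Address Low  */
    outb(0x3D5, start & 0xFF);
}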
Combuster
Member
Posts: 9301
Joined: Wed Oct 18, 2006 3:45 am
Libera.chat IRC: [com]buster
Location: On the balcony, where I can actually keep 1½m distance

Re: Anyone ever done "hello world" through a modern GPU?

Post by Combuster »

I think the first important thing now is that you take a step back from the mess and decide for yourself what you really want:
- Do you simply want to have a high resolution and put pixels on the screen?
- Do you want to have an operational blitter?
- Do you want a working 3D engine?
- Overlay engine?
- Hardware mouse cursor?
- OpenGL support?
- Hardware video decoding?
- Do you want to change video modes on the fly?
- Do you want arbitrary resolutions with arbitrary refresh rates instead of a few predefined modes?
- Do you want to have multiple screens?

And most importantly, what do you need? I assume that like the majority of people here, what you really need is just the first item on that list.

And if I'm right, what you're really asking has long been answered in the FAQ. If I'm not, there's a series of other pages on the wiki with relevant pointers.
"Certainly avoid yourself. He is a newbie and might not realize it. You'll hate his code deeply a few years down the road." - Sortie
[ My OS ] [ VDisk/SFS ]
LieutenantHacker
Member
Posts: 69
Joined: Sat May 04, 2013 2:24 pm
Location: Canada

Re: Anyone ever done "hello world" through a modern GPU?

Post by LieutenantHacker »

Funny because this page right here:

http://wiki.osdev.org/How_do_I_set_a_graphics_mode

claims that you need to write a device driver for an accelerated graphics card to get HD resolution.

You just told me otherwise about screen resolution. Or is that article just going around in circles again?

I don't need anything, I just want to know the facts and be done with the run-arounds.

You tell me the GPU does no drawing output; the article tells me I need a device driver for HD output right after. I thought the framebuffer and resolution could be controlled independently of a GPU or accelerator expansion card? If so, I don't need any device driver and that article is wrong, and I could set the resolution from MMIO addresses and be limited only by the framebuffer output itself.
The desire to hack, with the ethics to code.
I'm gonna build an 8-bit computer soon, with this as reference: http://www.instructables.com/id/How-to- ... -Computer/
Combuster
Member
Posts: 9301
Joined: Wed Oct 18, 2006 3:45 am
Libera.chat IRC: [com]buster
Location: On the balcony, where I can actually keep 1½m distance

Re: Anyone ever done "hello world" through a modern GPU?

Post by Combuster »

LieutenantHacker wrote:http://wiki.osdev.org/How_do_I_set_a_graphics_mode

Claims you need to write a device driver for an accelerated graphics card to get HD resolution.
I think you have trouble reading.
"Certainly avoid yourself. He is a newbie and might not realize it. You'll hate his code deeply a few years down the road." - Sortie
[ My OS ] [ VDisk/SFS ]
LieutenantHacker
Member
Posts: 69
Joined: Sat May 04, 2013 2:24 pm
Location: Canada

Re: Anyone ever done "hello world" through a modern GPU?

Post by LieutenantHacker »

http://oi59.tinypic.com/kbyu4g.jpg

"However, if you want to support high resolutions, you must write a driver".

I think I read it well enough. It's also wrong in saying that the "alternative approach" to the BIOS is a device driver. If you don't need an expansion card or GPU to output graphics (since the framebuffer and its controlling CRTC should be etched onboard), why does that very article say the only other way is a device driver? It must be wrong.

And lastly, speaking of "frame buffers", this whole article on them doesn't make much sense to me either:

http://en.wikipedia.org/wiki/Framebuffer
The desire to hack, with the ethics to code.
I'm gonna build an 8-bit computer soon, with this as reference: http://www.instructables.com/id/How-to- ... -Computer/
Owen
Member
Posts: 1700
Joined: Fri Jun 13, 2008 3:21 pm
Location: Cambridge, United Kingdom

Re: Anyone ever done "hello world" through a modern GPU?

Post by Owen »

A framebuffer is just an array of pixels.

Let us use my tablet as a hardware example. Actually, phones and tablets are pretty good examples because all the bits of video hardware can come from different vendors. For example, my tablet's framebuffer driver is by Samsung, but the GPU is an ARM Mali. They're both on one chip (the System on a Chip), but this needn't necessarily be the case (though for modern cards it is).

My tablet has a 2560x1504 display and therefore a framebuffer sized to contain 2560*1504 = 3,850,240 pixels. Given it uses a 32-bit-per-pixel framebuffer, that takes about 15 MiB.

A bit of hardware, called the "framebuffer driver", often shortened to just "framebuffer", or called the "CRTC" (CRT controller) for legacy reasons, is responsible for generating the timing signals that drive the display and for scanning out each of those pixels in turn to the display. Note that all it is doing is copying pixels from RAM.

The GPU executes a command stream generated by the CPU (that is, by the driver). It might take a command stream like
  • SET OutputBuffer Address=0xC0000000 Format=sRGBX8 Width=2560 Height=1504 RowPitch=(2560*4)
  • SET VertexShaderProgramPointer=0x12345678
  • SET FragmentShaderProgramPointer=0x23456789
  • SET VertexAttr0 Offset=0x00 Type=Vec4Float32
  • SET VertexAttr1 Offset=0x10 Type=Vec4Float32
  • DRAW VertexBuffer=0x34567890 Stride=0x20 Size=1000
(Or it might be very different. This is a hypothetical example)

You will observe the setting of vertex and fragment shader pointers here. This is telling the GPU what vertex shader "function" to invoke for every vertex, and what fragment shader "function" to invoke for every output fragment (a fragment is approximately a pixel but not quite). We then tell it where to put its output, where to gather the input data, what format it is in, and let it go.
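For illustration only, here is how a driver running on the CPU might encode such a command stream into a buffer. The opcodes and field layout below are invented to match the hypothetical commands above; every real GPU has its own packet format, so don't read this as any actual device's interface.

Code: Select all

#include <stdint.h>

/* Invented opcodes matching the hypothetical command list above. */
enum { CMD_SET_OUTPUT = 1, CMD_SET_VS, CMD_SET_FS, CMD_SET_ATTR, CMD_DRAW };

static uint32_t cmdbuf[256];
static unsigned cmdlen;

static void emit(uint32_t word) { cmdbuf[cmdlen++] = word; }

static void build_frame(void)
{
    emit(CMD_SET_OUTPUT); emit(0xC0000000); emit(2560); emit(1504); emit(2560 * 4);
    emit(CMD_SET_VS);     emit(0x12345678);              /* vertex shader program   */
    emit(CMD_SET_FS);     emit(0x23456789);              /* fragment shader program */
    emit(CMD_SET_ATTR);   emit(0); emit(0x00);           /* attribute 0, offset 0x00 */
    emit(CMD_SET_ATTR);   emit(1); emit(0x10);           /* attribute 1, offset 0x10 */
    emit(CMD_DRAW);       emit(0x34567890); emit(0x20); emit(1000);
    /* The buffer is then handed to the GPU, e.g. by writing its address and
       length to a doorbell or ring-buffer register; that part is device-specific too. */
}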

The GPU doesn't drive the display. It just "throws out" an image for the framebuffer hardware to push out to the display at some indeterminate later time.

The two are decoupled. As I said, on my tablet, the SoC is made by Samsung, as is the framebuffer driver inside of it. However, the actual GPU is made by ARM (as is, incidentally, the CPU). I have a development board with a Texas Instruments SoC containing an Imagination PowerVR GPU and a Texas Instruments framebuffer. In the mobile world, people just "glob" building blocks together.

Which is what AMD, NVIDIA and Intel do when building their GPUs. However, they manufacture them as well. This means that, with small exceptions (certain Intel SoCs have a PowerVR GPU, for example), you get one chip with a CompanyX GPU and a CompanyX framebuffer controller on it. But the modules internally are separate. You can structure your driver with separate framebuffer and 3D acceleration functions.

What you will find is that the framebuffer hardware is conceptually very similar across manufacturers but very different in the minute details; within each manufacturer it changes relatively slowly over time.
LieutenantHacker
Member
Posts: 69
Joined: Sat May 04, 2013 2:24 pm
Location: Canada

Re: Anyone ever done "hello world" through a modern GPU?

Post by LieutenantHacker »

So in short - I can set any video output mode I want directly without touching any GPU or video card, by going through the framebuffer?

Then, as stated, the article I linked goes about the process wrong.
The desire to hack, with the ethics to code.
I'm gonna build an 8-bit computer soon, with this as reference: http://www.instructables.com/id/How-to- ... -Computer/
thepowersgang
Member
Posts: 734
Joined: Tue Dec 25, 2007 6:03 am
Libera.chat IRC: thePowersGang
Location: Perth, Western Australia

Re: Anyone ever done "hello world" through a modern GPU?

Post by thepowersgang »

No, you need to program the CRTC (somehow). The 'driver' referred to in the article is a CRTC/framebuffer controller driver.

This driver might also end up being able to program the GPU (the specialised processor for high-speed graphics manipulation), but it doesn't have to in order to just expose the framebuffer.
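To give an idea of how small that "driver" is conceptually, here is a sketch of the interface it has to provide. The struct and function names are made up for illustration, not any existing kernel's API.

Code: Select all

#include <stdint.h>
#include <stddef.h>

/* Hypothetical driver interface: names and layout are invented. */
struct fb_mode   { uint32_t width, height, bpp, pitch; };
struct fb_driver {
    int       (*set_mode)(const struct fb_mode *mode); /* program the CRTC/display controller */
    uint32_t *(*get_framebuffer)(void);                /* scan-out buffer for the current mode */
};

/* Once such a driver exists, the kernel can treat every card the same way. */
static int show_splash(const struct fb_driver *drv)
{
    struct fb_mode mode = { 1920, 1080, 32, 1920 * 4 };
    if (drv->set_mode(&mode) != 0)
        return -1;                           /* the card (or the driver) can't do 1080p */
    uint32_t *fb = drv->get_framebuffer();
    for (size_t i = 0; i < (size_t)mode.height * (mode.pitch / 4); i++)
        fb[i] = 0x00336699;                  /* fill the whole screen with one colour */
    return 0;
}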
Kernel Development, It's the brain surgery of programming.
Acess2 OS (c) | Tifflin OS (rust) | mrustc - Rust compiler
Currently Working on: mrustc
embryo

Re: Anyone ever done "hello world" through a modern GPU?

Post by embryo »

LieutenantHacker wrote:So in short - I can set any video output mode I want directly without touching any GPU or video card, by going through the framebuffer?
The process of setting a video mode is as follows:
You just write the correct values to the right video card registers, and then you have the desired video mode. Do you see the GPU anywhere in that process?
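As one concrete instance of "just writing registers", here is a sketch that sets a 1080p linear-framebuffer mode on the Bochs/QEMU "std VGA" adapter through its VBE DISPI index/data ports (0x01CE/0x01CF). That register interface exists only on that emulated card and a few clones; real hardware has its own, different registers, which is exactly why drivers are per-device.

Code: Select all

#include <stdint.h>

static inline void outw(uint16_t port, uint16_t val)
{
    __asm__ volatile ("outw %0, %1" : : "a"(val), "Nd"(port));
}

static void dispi_write(uint16_t index, uint16_t value)
{
    outw(0x01CE, index);            /* VBE_DISPI_IOPORT_INDEX */
    outw(0x01CF, value);            /* VBE_DISPI_IOPORT_DATA  */
}

static void set_mode_1080p(void)
{
    dispi_write(4, 0x00);           /* ENABLE: off while reprogramming      */
    dispi_write(1, 1920);           /* XRES                                 */
    dispi_write(2, 1080);           /* YRES                                 */
    dispi_write(3, 32);             /* BPP                                  */
    dispi_write(4, 0x01 | 0x40);    /* ENABLE: enabled + linear framebuffer */
}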

A GPU is just another processor. Does a processor, by itself, show any pixels anywhere?

The GPU is used to calculate pixel data, and the calculations are exactly the kind a CPU could do. The only difference between a GPU and a CPU is performance on particular sets of operations: the CPU does matrix operations at a lower speed, while the GPU sucks at, for example, 64-bit multiplication.

High resolution gets connected to the GPU just because high-res content is most often video output. What is video output? It is a sequence of frames that should be displayed on the screen. If we already have all the frames ready, we need no GPU and can just copy the frames (meaning the bytes) to some predefined location from which the video card translates them into a video signal. But if we only have, say, an MP4 stream, then we have to decode it first. Decoding is a transformation of the stream's bytes into other bytes, and that transformation is a task for the GPU, just because the GPU does it faster than the CPU. After the transformation is complete we still have only bytes and no pixels on the screen. Next, the video card has to translate our bytes into video signals. The hardware on the video card that does this translation is very different from the GPU. In short: the video card IS NOT the GPU. But a video card can have a GPU as a slave chip for better performance.

Another point: AMD's A10 processor contains a GPU as part of the chip, yet it is not a video card; it is a processor. Where the GPU sits is not very important when only low-performance graphics is required.

Once more: we need the GPU just to speed things up. We can go without a GPU, but image transformations will be slower. And, a bit in reverse, we can also ask the GPU to do things a traditional CPU is supposed to do. For example, 3D games use the GPU for the physical world model: it calculates the speed and position of game objects, the point where a bullet's trajectory intersects the body of some monster, the trajectory of a monster's jump under low gravity, and so on. Of course, a trajectory is not pixels on the screen; we have to translate it into a sequence of images of the monster positioned at different points on the screen. That task is also done by the GPU: it calculates the position of every pixel of the monster for every frame within a second, then the colour of those pixels, and then it transfers the bytes of every pixel's colour to some video memory location. The video card hardware then translates those bytes into signals for the video device, and the video device (an LCD monitor, say) shows the pixels to you. All of these calculations could be done by a traditional CPU without help from the GPU, only slower.
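To make the "same calculation, just slower on the CPU" point concrete, here is a per-pixel brightness scale written as a plain CPU loop. Every iteration is independent of every other one, which is exactly the kind of work a GPU fragment or compute shader runs massively in parallel.

Code: Select all

#include <stdint.h>
#include <stddef.h>

/* Scale the brightness of a 32-bit ARGB image in place. percent is 0..100.
   Each pixel is independent of every other pixel, so a GPU could process
   thousands of them at once; the CPU simply does them one by one. */
static void scale_brightness(uint32_t *pixels, size_t count, uint32_t percent)
{
    for (size_t i = 0; i < count; i++) {
        uint32_t p = pixels[i];
        uint32_t r = ((p >> 16) & 0xFF) * percent / 100;
        uint32_t g = ((p >>  8) & 0xFF) * percent / 100;
        uint32_t b = ( p        & 0xFF) * percent / 100;
        pixels[i] = (p & 0xFF000000u) | (r << 16) | (g << 8) | b;
    }
}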
LieutenantHacker
Member
Posts: 69
Joined: Sat May 04, 2013 2:24 pm
Location: Canada

Re: Anyone ever done "hello world" through a modern GPU?

Post by LieutenantHacker »

OEMs never seem to care about the term "video card", or about including one; I don't use (or prefer) the term "video card" because it implies an expansion card by default. Accessing the display directly is supposed to be a legacy capability, supported with no expansion card at all. Saying that you "write to a video card" brings us back to the GPU-related conundrum again:

1. I have never seen or heard of any modern computer system whose OEM manual advertises or even mentions anything about "video cards" on the motherboard.

2. If writing to a video card register alters the CRTC, then the CRTC can evidently be manipulated without a "video card" at all.

3. What are those memory-mapped addresses for if they're on every x86-64 IBM-compatible motherboard? That would require every engineer to map every address to the right location, regardless of hardware. Again, this is still unsettled.

Does not make any sense.

You are telling me now that a "hypothetical expansion card" appears when I address MMIO, but that when I need HD resolution I must live more than one life (as the article states) to get high definition, all while the GPU doesn't drive a video signal.

And then I have other people, who are also software engineers, telling me otherwise.

Thanks for nothing.
The desire to hack, with the ethics to code.
I'm gonna build an 8-bit computer soon, with this as reference: http://www.instructables.com/id/How-to- ... -Computer/
Locked