Linux Support of Integrated Graphics Processing Units, 'Intel HD' vs. 'AMD Vega'

Hello,

I am planning on building a computer without a dedicated GPU that will be used as a "daily driver", which includes light gaming. How is the Linux support for the integrated graphics in Intel and AMD processors these days? Does one vendor clearly have the advantage when it comes to integrated graphics drivers, etc.?

I am planning on running stable Ubuntu releases on the machine.
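For what it's worth, here is the kind of check I'd run once the machine is up, to confirm which kernel driver and Mesa renderer the iGPU is actually using. A minimal sketch, assuming the stock Ubuntu tools lspci (from pciutils) and glxinfo (from mesa-utils) are installed:

    #!/usr/bin/env python3
    """Show which kernel driver and OpenGL renderer the iGPU is using.
    Diagnostic sketch for Ubuntu; expects lspci (pciutils) and glxinfo
    (mesa-utils) on the PATH."""
    import subprocess

    def run(cmd):
        """Return a command's stdout, or '' if it is missing or fails."""
        try:
            return subprocess.run(cmd, capture_output=True, text=True,
                                  check=True).stdout
        except (OSError, subprocess.CalledProcessError):
            return ""

    lines = run(["lspci", "-k"]).splitlines()
    for i, line in enumerate(lines):
        if "VGA compatible controller" in line or "Display controller" in line:
            print(line.strip())
            for detail in lines[i + 1:]:
                if not detail.startswith("\t"):
                    break  # indented detail lines for this device are done
                if "Kernel driver in use" in detail:
                    # Expect i915 for Intel HD/UHD, amdgpu for a Vega APU.
                    print("  " + detail.strip())

    # The renderer string confirms Mesa is driving the chip.
    renderer = [l for l in run(["glxinfo"]).splitlines()
                if "OpenGL renderer" in l]
    print(renderer[0].strip() if renderer else
          "glxinfo not found (install mesa-utils)")

Both vendors' iGPU drivers ship in the mainline kernel and Mesa, so on a stable Ubuntu release there is nothing extra to install for either.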

πŸ‘︎ 9
πŸ“…︎ Mar 27 2019
Chipmaking giant Nvidia could potentially restart production of dedicated graphics processing units (GPUs) for cryptocurrency miners, according to its executive vice president and chief financial officer. coindesk.com/nvidia-may-r…
πŸ‘︎ 342
πŸ‘€︎ u/pale_blue_dots
πŸ“…︎ Jan 18 2021
Nvidia and AMD graphics cards drop to around 190% of their list price pcgamesn.com/amd-nvidia-g…
πŸ‘︎ 187
πŸ‘€︎ u/WorldlyCaregiver
πŸ“…︎ Jun 22 2021
Chinese miners are making bulk purchases of laptops featuring Nvidia's GeForce 30 series of graphics processing units to mine ETH okex.to/bsn-continues-glo…
πŸ‘︎ 2
πŸ‘€︎ u/aksinya_sidorova
πŸ“…︎ Feb 11 2021
VideoCardz: "ARM announces Mali-G710, G610, G510 and G310 graphics processing units" videocardz.com/press-rele…
πŸ‘︎ 164
πŸ‘€︎ u/Dakhil
πŸ“…︎ May 25 2021
DIAWAY #EDGEBOX is a 2U hybrid server rack unit available now! Processing power is provided by single-socket AMD EPYC 7002 Series Processors while storage is provided by a mix of HDD and NVMe drives. storagereview.com/news/di…
πŸ‘︎ 8
πŸ‘€︎ u/Luci_SR
πŸ“…︎ Jul 10 2020
Once graphics processing became too complicated for CPUs, the GPU was invented, but now GPUs can't render light all that well. Why haven't Light Processing Units been invented?

Lighting is without a doubt the most demanding task a GPU can do, especially with ray tracing.

My question is, why hasn't there been a dedicated processing unit to deal with it yet? Will there ever be? Is it feasible?

Please go into as much depth as possible.

I would imagine it's not impossible, because we are already capable of multi-GPU setups, which is essentially what this would be.

Similarly, let's say we had two PUs working in tandem, one rendering the scene and one rendering the lighting: scene, light, scene, light... Obviously at 60fps this alternation would be jarring, but would it be achievable at 240fps? Or is that too fast for modern clock cycles to synchronise? I imagine it would be so fast that you'd never be able to tell a lighting-only frame was being presented, and your brain would create a composite of the two.
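To make the tandem idea concrete: the workable version isn't alternating displayed frames but pipelining the two stages, roughly what deferred rendering already does within one GPU. A toy Python model of the overlap, with sleeps standing in for the real work; this is not graphics code:

    """Two 'units' in tandem: while unit A builds the geometry buffers for
    frame N+1, unit B lights frame N, so every displayed frame is complete
    and nothing flickers. Toy model only; work is simulated with sleeps."""
    import queue
    import threading
    import time

    handoff = queue.Queue(maxsize=1)  # intermediate buffers between units

    def geometry_unit(n_frames):
        for frame in range(n_frames):
            time.sleep(0.004)   # pretend: rasterise the scene for this frame
            handoff.put(frame)  # pass the frame's buffers to the lighting unit
        handoff.put(None)       # sentinel: no more frames

    def lighting_unit():
        while (frame := handoff.get()) is not None:
            time.sleep(0.004)   # pretend: lighting / shading pass
            print(f"frame {frame} displayed, fully lit")

    worker = threading.Thread(target=geometry_unit, args=(5,))
    worker.start()
    lighting_unit()
    worker.join()

The price is one extra frame of latency plus the bandwidth of handing the intermediate buffers between the units, which is a large part of why such splits tend to live on a single die rather than across two cards.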

Or how about a system similar to how 3D TVs work, where one image is projected to the right eye and another to the left, so that your brain melds the two? Actually, that's a pretty promising idea.

Additionally, assuming for a second that this is feasible, how would you code it?

What would be your take on it?

What do you see the drawbacks being, or the technical barriers?

Oh, and lastly, if an LPU were to exist, would it be like a GPU? Would the code be run in parallel, or would it benefit from a CPU-style architecture?

Many thanks in advance for taking the time to look at this. I hope you found this an interesting topic to think about.

πŸ‘︎ 29
πŸ‘€︎ u/Dannyboi93
πŸ“…︎ Dec 13 2020
Dear "game ray tracing developers"... Try to learn from Rockstar. A seven-year-old game can produce these incredible reflection graphics and doesn't need any dedicated processing units. πŸ‘»
πŸ‘︎ 71
πŸ‘€︎ u/DarkwraithKnight
πŸ“…︎ Oct 29 2020
Hi, I was trying to mine Dogecoin with Unmineable using my 6.2 GB AMD Radeon R7 Graphics card (using the ethash algorithm) and it doesn't mine anything; I'll leave the details here. I added the apps and folders to the exceptions list and also to the firewall, but it doesn't work.

Waiting 15 s for previous instance to close
Eth speed: 0.000 MH/s, shares: 0/0/0, time: 0:01
Eth speed: 0.000 MH/s, shares: 0/0/0, time: 0:01
Eth speed: 0.000 MH/s, shares: 0/0/0, time: 0:01
Eth: New job #ac6a2804 from etchash.unmineable.com:3333; diff: 4000MH
Eth: New job #733b4be0 from etchash.unmineable.com:3333; diff: 4000MH
GPU1: 71C
No CUDA driver found
Eth speed: 0.000 MH/s, shares: 0/0/0, time: 0:02
Unknown OpenCL driver version! Hashrate and stale shares may suffer
OpenCL platform: OpenCL 2.1 AMD-APP (3224.5)
Available GPUs for mining:
GPU1: AMD Radeon R7 Graphics (pcie 0), OpenCL 2.0, 4.6 GB VRAM, 6 CUs
Eth: the pool list contains 1 pool (1 from command-line)
Eth: primary pool: etchash.unmineable.com:3333
Starting GPU mining
GPU1: AMD driver 21.3.2
Eth: Connecting to ethash pool etchash.unmineable.com:3333 (proto: EthProxy)
GPU1: 64C
Unable to start CDM server at port 60080: Only one usage of each socket address (protocol/network address/port) is permitted (10048)
Eth: Connected to ethash pool etchash.unmineable.com:3333 (157.245.124.70)
Eth: New job #733b4be0 from etchash.unmineable.com:3333; diff: 4000MH
GPU1: Starting up... (0)
GPU1: Generating etchash light cache for epoch #211
Eth speed: 0.000 MH/s, shares: 0/0/0, time: 0:02
Eth: New job #6513dc0a from etchash.unmineable.com:3333; diff: 4000MH
Eth: New job #6513dc0a from etchash.unmineable.com:3333; diff: 4000MH
Eth speed: 0.000 MH/s, shares: 0/0/0, time: 0:00
Light cache generated in 3.8 s (11.0 MB/s)
GPU1: Using generic OpenCL kernels (device name 'Bristol Ridge')
GPU1: Free VRAM: 5.906 GB; used: 17179869182.707 GB
GPU1: Allocating DAG for epoch #211 (2.65) GB
GPU1: Generating DAG for epoch #211
Eth speed: 0.000 MH/s, shares: 0/0/0, time: 0:02
GPU1 not responding
Thread(s) not responding. Restarting.
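Two distinct problems show up in that log. The 10048 line is Windows saying only one program can hold a socket address at a time, i.e. a previous miner instance is still holding the port; the closing "GPU1 not responding" is the OpenCL kernels stalling on this Bristol Ridge iGPU, which is a separate issue. A small, hedged check for the port clash; 60080 is simply the port the log reports:

    """Check whether the miner's reporting port is still held by an earlier
    instance; error 10048 (WSAEADDRINUSE) in the log above means it was.
    The port number is taken from the 'Unable to start CDM server' line."""
    import socket

    PORT = 60080

    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        try:
            s.bind(("127.0.0.1", PORT))
            print(f"port {PORT} is free; the old instance has exited")
        except OSError as err:
            print(f"port {PORT} still in use ({err}); "
                  "kill the previous miner process first")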

πŸ‘︎ 13
πŸ‘€︎ u/Marrarr1
πŸ“…︎ May 07 2021
What are some units you like because of their artwork and/or skill graphics but are trash/underrated/low on the tier list?

I would say mine are Lily and Dahlia

πŸ‘︎ 63
πŸ“…︎ Oct 23 2020
Considerations for the AMD Ryzen APU (Accelerated Processing Unit)

Who should buy?

People on a low budget who are starting from scratch. An APU gives them the ability to buy their computer early and start playing: it lets the user play games without a dedicated graphics card, and later buy a graphics card and upgrade without the hassle of first selling a used one. At around 300 USD you will be able to play esports games comfortably; expect to play at 1080p low-medium settings in most games. Playing on an Intel iGPU is not ideal and should not be done; Intel iGPUs are mostly for watching videos. You should always aim for 60 fps (frames per second) to match the monitor refresh rate, which is usually 60 Hz.

If you are already planning to buy a dedicated graphics card, do not buy an APU, as you would be unable to use the APU's graphics once you switch to the dedicated card; it would be better to buy a stronger CPU instead, as you would be able to afford it. Also, APUs have 8 fewer PCIe lanes, which would normally be used for the APU's graphics. This is a Ryzen 3 2200G build (330 USD, made on the same day as this post): https://pcpartpicker.com/list/PDPn9J Windows is excluded as you can use Linux for free; see https://www.scdkey.com/microsoft-windows-10-home-oem-cd-key-global_1379-20.html for cheap Windows.

Ryzen 3 2200G vs. Ryzen 5 2400G

The Ryzen 5 has four cores and eight threads clocked at 3.6 GHz (3.9 GHz boost), plus 192 more stream processors/shaders than the Ryzen 3, clocked at 1250 MHz versus the Ryzen 3's 1100 MHz.

CPU-wise, the Ryzen 5 is better for multi-threaded apps like 7-Zip and for multitasking; the R5 has SMT (AMD's equivalent of Hyper-Threading).

Game performance improvements seem to vary between 7 and 20 percent. https://www.eurogamer.net/articles/digitalfoundry-2018-ryzen-3-2200-g-ryzen-5-2400g-review

RAM

As the APU does not have dedicated video RAM, it relies heavily on system RAM clock speed. The higher the memory clock, the better for the CCX interconnect, which improves CPU performance, and the higher memory frequency also raises fps. https://www.anandtech.com/show/12621/memory-scaling-zen-vega-apu-2200g-2400g-ryzen

Please also note that single-channel memory does not provide enough memory bandwidth for the GPU; dual-channel memory is required. https://www.gamersnexus.net/guides/3244-amd-r5-2400g-memory-kit-benchmarks-and-single-vs-dual-channel
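Rough arithmetic makes the dual-channel point obvious: DDR4 transfers 8 bytes per channel per memory transaction, and the iGPU shares whatever total the memory controller can deliver. A quick back-of-envelope sketch at the DDR4-2933 spec these APUs support:

    """Back-of-envelope shared-memory bandwidth for a 2200G/2400G-class APU.
    DDR4 moves 8 bytes per channel per transfer; the iGPU and CPU split it."""
    MEGATRANSFERS = 2933          # DDR4-2933
    BYTES_PER_TRANSFER = 8        # one 64-bit channel

    for channels in (1, 2):
        gb_per_s = MEGATRANSFERS * 1e6 * BYTES_PER_TRANSFER * channels / 1e9
        print(f"{channels} channel(s): ~{gb_per_s:.1f} GB/s")
    # -> 1 channel(s): ~23.5 GB/s
    # -> 2 channel(s): ~46.9 GB/s

Even a modest dedicated card has well over 100 GB/s to itself, so halving an already-shared ~47 GB/s hurts the iGPU badly.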

Motherboard

A320: you don't have to deal with overclocking the CPU; it runs as specified. The default RAM specification is 2933 MHz, so 3000 MHz memory will run at 2933 by default; anything rated lower will run at its own specification. This moth

... keep reading on reddit ➑

πŸ‘︎ 3
πŸ‘€︎ u/uglypotion
πŸ“…︎ Aug 20 2018
Here is a new AMD patent list, featuring some of the latest advances developed by AMD in architecture, graphics, packages and more. Follow the thread! (3/6) twitter.com/Underfox3/sta…
πŸ‘︎ 74
πŸ‘€︎ u/alex_stm
πŸ“…︎ Aug 07 2020
The game couldn't start because your Graphics Processing Unit (GPU) is not supported. What is this?? I have a GTX 1080. I played the beta just fine.
πŸ‘︎ 211
πŸ‘€︎ u/devuzius
πŸ“…︎ Oct 25 2019
Global Supply of Graphics Processing Units Depleted Due to Cryptocurrency Mining Craze news.bitcoin.com/cryptocu…
πŸ‘︎ 49
πŸ‘€︎ u/JonyRotten
πŸ“…︎ Jun 05 2017
Lighting (with process list) for page 43 of graphic novel [OC] v.redd.it/fyh6kjhsoo671
πŸ‘︎ 63
πŸ“…︎ Jun 21 2021
Graphics Processing Unit (GPU), a brief overview

Welcome to r/SteinsTech,

GPU: a brief introduction to the functionality of the graphics card found in PCs and mobile devices, and the principles it is based on.

https://drive.google.com/file/d/1DH9j6k-4hUM3tTvuz90dsCUgAUM43SjY/view?usp=sharing

πŸ‘︎ 2
πŸ‘€︎ u/Ravi17c
πŸ“…︎ Nov 23 2020
Here is a new AMD patent list, featuring some of the latest advances developed by AMD in architecture, graphics, packages and more. Follow the thread! (4/6) twitter.com/Underfox3/sta…
πŸ‘︎ 8
πŸ‘€︎ u/alex_stm
πŸ“…︎ Aug 11 2020
the combination of the GPU and the graphics processing unit
πŸ‘︎ 3
πŸ‘€︎ u/milkprogrammer
πŸ“…︎ Jun 13 2019
The Global Supply of Graphics Processing Units Is Depleted Due to Cryptocurrency Mining Craze news.bitcoin.com/cryptocu…
πŸ‘︎ 15
πŸ‘€︎ u/MemoryDealers
πŸ“…︎ Jun 05 2017
2014 - The year of the APU. AMD promises that by 2014, their APU platform will have 'heterogeneous' cores that will automatically decide between CPU and graphics processing in real-time.

This is by far the most interesting technological news I have read all year.

Currently, APUs have a CPU with graphics hardware alongside it. If AMD can pull this off, it will mean much faster processing for graphics and CPU alike, and more efficient power usage. These are my sources:

http://news.softpedia.com/news/AMD-Will-Have-Full-CPU-and-GPU-Fusion-in-2014-250416.shtml

http://www.anandtech.com/show/5493/amd-outlines-hsa-roadmap-unified-memory-for-cpugpu-in-2013-hsa-gpus-in-2014

http://www.dannzfay.com/2012/02/amd-will-have-full-cpu-and-gpu-fusion.html

http://www.xbitlabs.com/news/cpu/display/20120202102405_AMD_Promises_Full_Fusion_of_CPU_and_GPU_in_2014.html

http://i1-news.softpedia-static.com/images/news2/AMD-Will-Have-Full-CPU-and-GPU-Fusion-in-2014-3.jpg

Basically, this is how it is for the APU right now: say I need only a little CPU at the moment and as much GPU as possible (one CPU core plus the entire GPU, which is about half the APU's die). The rest of the CPU die space simply sits idle. If one part of the chip is maxed out, the workload can only use what that component supplies, and no more; the other part can sit idle or under-used.

The 2014 APU: the same workload, but the entire die can be devoted to graphics, effectively doubling what today's APU could do. Not to mention, the transistors will be smaller in 2014 and the architectures newer and more efficient. Likewise, during CPU-intensive tasks, all the cores can be allocated to that task while the GPU is scaled down.
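As a loose sketch of the scheduling idea (the names and threshold below are illustrative, not AMD's actual design): the chip would route wide, data-parallel work to GPU-style cores and branchy, serial work to CPU-style cores in real time:

    """Toy dispatcher for the 'heterogeneous cores' idea: pick a side of
    the die per task based on how parallel the task is. Illustrative only."""

    def dispatch(task, parallel_elements, threshold=1024):
        # Wide uniform work amortises across many small GPU-style cores;
        # serial or branchy work wants a few fast CPU-style cores.
        side = ("GPU-style cores" if parallel_elements >= threshold
                else "CPU-style cores")
        print(f"{task}: {parallel_elements} elements -> {side}")

    dispatch("pixel shading", 2_000_000)  # data-parallel -> GPU side
    dispatch("game logic", 1)             # serial        -> CPU side
    dispatch("audio mixing", 64)          # small batch   -> CPU side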

I don't know about you guys, but WOW. Never before has anything like this been done. This means that graphics cards as a whole will either be able to adopt the same technology and serve as a CPU, or they won't be needed at all (in most cases).

Sorry for the deletes and re-submissions... It keeps getting messed up and glitchy :(

πŸ‘︎ 3
πŸ‘€︎ u/TekTekDude
πŸ“…︎ Feb 07 2012
The Physically Unclonable Functions Found in Standard PC Components project, or PUFFIN, discovered that every graphics processing unit has a unique and defining set of characteristics. That means it might be useful for security authentication. h30565.www3.hp.com/t5/Fea…
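A conceptual sketch of how PUF-style authentication works in general; this is not PUFFIN's actual measurement method, and measure_gpu_fingerprint below is a hypothetical stand-in for reading the hardware's physical variation:

    """PUF-style authentication in miniature: enrol a device's physical
    'fingerprint' once, then authenticate by re-measuring and comparing
    within a noise tolerance. measure_gpu_fingerprint is hypothetical;
    a real PUF reads uncontrollable manufacturing variation, not a seed."""
    import random

    def measure_gpu_fingerprint(device_seed, noise=0.02):
        # device_seed stands in for the physical identity of one GPU.
        rng = random.Random(device_seed)
        return [rng.uniform(0, 1) + random.uniform(-noise, noise)
                for _ in range(32)]

    def matches(enrolled, probe, tolerance=0.1):
        return all(abs(a - b) < tolerance for a, b in zip(enrolled, probe))

    enrolled = measure_gpu_fingerprint(device_seed=42)       # enrolment
    print(matches(enrolled, measure_gpu_fingerprint(42)))    # same GPU: True
    print(matches(enrolled, measure_gpu_fingerprint(7)))     # other GPU: False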
πŸ‘︎ 79
πŸ‘€︎ u/wordsmithie
πŸ“…︎ Oct 29 2012
I sexually identify as an AMD Threadripper 2990WX 32 core 64 thread x86_64 socket TR4 central processing unit.

Ever since I was a boy I dreamed of achieving a higher IPC than Skylake-X and crushing render times down to mere seconds by being a super-threading beast. People say to me that a person being a 2990WX with 64 threads is impossible and I'm fucking retarded, but I don't care, I'm powerful. I'm having GlobalFoundries fabricate 12nm CCX dies onto my body.

From now on I want you guys to call me "Threadripper" and respect my right to crush Intel Skylake-X HEDTs in Cinebench performance and power efficiency. If you can't accept me, you're an Intel shill and need to check your rendering privileges. Thank you for being so understanding.

πŸ‘︎ 61
πŸ‘€︎ u/Fatal_Taco
πŸ“…︎ Oct 22 2018
What is the best book on the history of the Graphics Processing Unit (GPU)?

Ideally not too technical.

πŸ‘︎ 3
πŸ‘€︎ u/GraciousMeAndMine
πŸ“…︎ Mar 08 2018
