

How to melt your processor?
#1
What particular digital activity can you name that generates the highest prolonged use - and thus heat - of either the CPU alone, or the CPU and GPU both?

[Edited for stylistics]
Reply
#2
Well, gaming with a recent game at ultra graphics settings! Wink

I recently tried some Minecraft shaders (OptiFine) that made the game look so real... it's awesome!
I was like, bah, it's Minecraft, I have a good system, let's put it on Extreme settings.
The game then proceeded to load and display at 5 FPS, but everything looked "real", water too.
Pretty sure it's GPU versus CPU at something around 85% versus 15%, though, hehe!
- TheDead (TheUxNo0b)

If my blabbering was helpful, please click my [Thank] link.
Reply
#3
(08-03-2019, 04:45 PM)TheDead link Wrote: Well, gaming with a recent game at ultra graphics settings! Wink

I recently tried some Minecraft shaders (OptiFine) that made the game look so real... it's awesome!
I was like, bah, it's Minecraft, I have a good system, let's put it on Extreme settings.
The game then proceeded to load and display at 5 FPS, but everything looked "real", water too.
Pretty sure it's GPU versus CPU at something around 85% versus 15%, though, hehe!
I was pretty sure someone would mention high-end gaming, but I would ask whether it is really a universal answer - can you even launch a high-end game on a low-end system without getting an error message right off the bat? Often the game will simply crash if the machine does not meet the requirements for the higher-end settings applied. It could eventually overburden the system, but just as likely the available resources remain idle.

What about things that work regardless of the hardware?

What I know of is participation in volunteer distributed computing, such as the Berkeley Open Infrastructure for Network Computing (BOINC). I participated for a few years myself. The applications are written to use whatever resources are available, both CPU and GPU. Another case: dynamic fractal rendering in fine quality. It is a more abstract example, since this would probably work better with dedicated software based on a set algorithm, but as an example, try generating a Mandelbrot fractal even in Pinta, set the quality high and the factor low, and just keep changing the zoom. Fractals, ideally, can be zoomed into perpetually, live. Perhaps advanced video editing as well?
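For what it's worth, the Mandelbrot idea needs no dedicated software at all - here is a toy ASCII renderer of my own (not Pinta's algorithm, just an illustration): each pixel iterates z = z² + c until it escapes, so raising the resolution or the iteration cap scales the CPU work almost without limit.

```python
def mandelbrot_iters(c: complex, max_iter: int = 200) -> int:
    """Number of iterations before z = z*z + c escapes |z| > 2."""
    z = 0j
    for n in range(max_iter):
        z = z * z + c
        if abs(z) > 2.0:
            return n
    return max_iter  # never escaped: point is (probably) in the set

def render(width: int = 80, height: int = 24, max_iter: int = 200) -> str:
    """ASCII render of the Mandelbrot set over [-2.2, 1.0] x [-1.2, 1.2].

    '#' marks points that never escaped; '.' marks points that did.
    Bumping width/height/max_iter (or zooming the window) multiplies
    the arithmetic done per frame - which is exactly the CPU load.
    """
    rows = []
    for j in range(height):
        y = -1.2 + 2.4 * j / (height - 1)
        row = ""
        for i in range(width):
            x = -2.2 + 3.2 * i / (width - 1)
            row += "#" if mandelbrot_iters(complex(x, y), max_iter) == max_iter else "."
        rows.append(row)
    return "\n".join(rows)

if __name__ == "__main__":
    print(render())
```

A live "perpetual zoom" would just re-run render() with a shrinking window around a chosen point, which is why those visualisers keep a core pegged indefinitely.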
Reply
#4
(08-03-2019, 06:10 AM)MS link Wrote: What particular digital activity can you name that generates the highest prolonged use - and thus heat - of either the CPU alone, or the CPU and GPU both?

[Edited for stylistics]

Back in the day (LOL) overclocking the CPU was what the cool kids used to do...
Fine tuning, hoping that the fan would keep it cool enough to go buck wild and turn it all the way up... Smile

Did it a few times... on the Athlons (if I recall).

Oops, did I date myself???
LL4.8 UEFI 64 bit ASUS E402W - AMD E2 (Quad) 1.5Ghz  - 4GB - AMD Mullins Radeon R2
LL5.8 UEFI 64 bit Test UEFI Kangaroo (Mobile Desktop) - Atom X5-Z8500 1.44Ghz - 2GB - Intel HD Graphics
LL4.8 64 bit HP 6005- AMD Phenom II X2 - 8GB - AMD/ATI RS880 (HD4200)
LL3.8 32 bit Dell Inspiron Mini - Atom N270 1.6Ghz - 1GB - Intel Mobile 945GSE Express  -- Shelved
BACK LL5.8 64 bit Dell Optiplex 160 (Thin) - Atom 230 1.6Ghz - 4GB-SiS 771/671 PCIE VGA - Print Server
Running Linux Lite since LL2.2
Reply
#5
[member=5414]firenice03[/member], not that long ago I used to overclock my weak GPU by 20%, but since finding the 'performance mode' setting on Ubuntu, I no longer see any need to, for basic usage.
Reply
#6
Well... for the GPU, once you install Wine, you can run a small app called FurMark in "burn" mode.
I use version 1.8.5 since it's the most "retro"-compatible with older GPUs because of the OpenGL version it requires.
 
From what I've seen, all recent GPU-related apps use the Nvidia "CUDA" architecture or OpenCL / Vulkan.
Since CUDA only goes back to the GeForce 8 series, if you have cards older than that, you'll have to stick with OpenGL.

CPU-wise, you'll have to use another app, though - maybe run one in parallel. Wink
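On that "another app in parallel" note - a CPU burner is simple enough to sketch yourself. This is just my own toy illustration (not FurMark, which is GPU-only): one busy-looping process per logical core, run alongside the GPU burn.

```python
import multiprocessing as mp
import time

def burn(seconds: float) -> int:
    """Busy-loop for `seconds`; returns the iteration count."""
    count = 0
    deadline = time.monotonic() + seconds
    while time.monotonic() < deadline:
        count += 1
    return count

def burn_all_cores(seconds: float = 1.0) -> list:
    """Peg every logical core for `seconds`, one process per core.

    Uses processes rather than threads so the GIL doesn't serialise
    the work onto a single core.
    """
    n = mp.cpu_count()
    with mp.Pool(n) as pool:
        return pool.map(burn, [seconds] * n)

if __name__ == "__main__":
    results = burn_all_cores(1.0)
    print(f"loaded {len(results)} cores")
```

Obligatory caution given the thread title: only run this on a machine whose cooling you trust, and keep an eye on temperatures.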
- TheDead (TheUxNo0b)

If my blabbering was helpful, please click my [Thank] link.
Reply
#7
.
Reply
#8
@'The Repairman', nice job! Uh, I guess. Nowadays, such a thing would rather not happen, would it?
Reply
#9
(08-15-2019, 03:43 AM)The Repairman link Wrote: [quote author=firenice03 link=topic=6609.msg48189#msg48189 date=1564852498]
Back in the day (LOL) overclocking the CPU was what the cool kids used to do...
Fine tuning, hoping that the fan would keep it cool enough to go buck wild and turn it all the way up... Smile

Did it a few times... on the Athlons (if I recall).

Oops, did I date myself???
Did this for the hell of it just to see if it would really happen, and it sure did.


Destructive yes it was.
Interesting yes it was.
Yes it was a lot of fun.

The good old days of computers.
[/quote]

LOL... I never pushed it THAT far, but yeah...


(08-15-2019, 06:14 AM)MS link Wrote: @'The Repairman', nice job! Uh, I guess. Nowadays, such a thing would rather not happen, would it?

I would hope most "newer/modern" systems have some type of thermal throttling, or would shut down outright if they reached a near-critical temperature.
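They generally do - and on Linux you can watch the same sensors the kernel's throttling decisions are based on. A small sketch reading the thermal zones from sysfs (assuming the standard /sys/class/thermal layout; zone names and availability vary by platform, and VMs often expose none):

```python
from pathlib import Path

def read_temps(base: str = "/sys/class/thermal") -> dict:
    """Return {zone_type: temp_celsius} from the kernel's thermal zones.

    Each /sys/class/thermal/thermal_zone*/temp file holds the
    temperature in millidegrees Celsius. Zones that can't be read
    are skipped; returns {} if none exist.
    """
    temps = {}
    for zone in sorted(Path(base).glob("thermal_zone*")):
        try:
            zone_type = (zone / "type").read_text().strip()
            millideg = int((zone / "temp").read_text().strip())
        except (OSError, ValueError):
            continue
        temps[zone_type] = millideg / 1000.0
    return temps

if __name__ == "__main__":
    for name, celsius in read_temps().items():
        print(f"{name}: {celsius:.1f} C")
```

The matching trip_point_*_temp files in each zone directory show where the kernel will start throttling or force a shutdown.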




LL4.8 UEFI 64 bit ASUS E402W - AMD E2 (Quad) 1.5Ghz  - 4GB - AMD Mullins Radeon R2
LL5.8 UEFI 64 bit Test UEFI Kangaroo (Mobile Desktop) - Atom X5-Z8500 1.44Ghz - 2GB - Intel HD Graphics
LL4.8 64 bit HP 6005- AMD Phenom II X2 - 8GB - AMD/ATI RS880 (HD4200)
LL3.8 32 bit Dell Inspiron Mini - Atom N270 1.6Ghz - 1GB - Intel Mobile 945GSE Express  -- Shelved
BACK LL5.8 64 bit Dell Optiplex 160 (Thin) - Atom 230 1.6Ghz - 4GB-SiS 771/671 PCIE VGA - Print Server
Running Linux Lite since LL2.2
Reply
#10
.
Reply

