AMD Ryzen

jmc

Active member
Hope you can do an H264 vs x264 fps test?

Was thinking of a Small Form Factor build, but the X300 chipset is not out till mid-year.
So I decided to replace my old Intel quad 94?? CPU with an 8-core Ryzen 1700.

Ordered 16 GB of 3200 fast ram $176 ($40 off sale)... Wow, ram is no longer $5/GB! Had wanted 64GB.
Now only have to catch the motherboard and cpu on sale.
Should total up to less than $700.

Then I'll do my own test and finally see if Ryzen gets the greater-than-twice fps boost
with x264 that my 6-core 3930 gets.

jmc
 

jmc

Active member
Finally got my Ryzen 1800X/X370 MB up and running Win7-64. Had no trouble installing, or finding the latest drivers for all the MB devices.
Still waiting for my "free" cooler bracket update. Gave up and put on an old clip-on cooler for now; no overclocking, but it will handle encoding loads at the stock 3.6 GHz.
Encoding a 1080i recording down to 720x400, VRD/h264 does use all 16 threads with a fairly uniform 50-60% load on each. I had encoded the same file on the old FX-8370/AMD 970 system; the new Ryzen finished the same encode in half the time at twice the fps.
Temps stayed under 65C, but I didn't want to try encoding 2 at once until I can get a better cooler setup installed.



Don't know if you are overclocking at all.

https://www.pcper.com/news/Processors/PSA-AMD-XFR-Enabled-All-Ryzen-CPUs-X-SKUs-Have-Wider-Range

As I understand it...
The X Ryzen chips get only a 100 MHz XFR bump on ALL cores,
plus the big turbo boost jump on TWO cores.

Example: the 1800X has a 3.6 GHz base and boosts to 4.0 GHz (on two cores only).
All cores get +100 MHz (depending on temperature); the non-X 1700 gets a 50 MHz bump.

So if you want a big boost on ALL cores you must overclock!

Guess I'll find out if this is correct when my 1700 gets in.

jmc
 

Otter

Member
Don't know if you are overclocking at all.
Finally got the AM4 bracket for my Hyper 212 cooler. Tried some quick OC; I don't do gaming, but wanted to see the effect on encode times.
No problem hitting 3991.9 MHz (effectively 4 GHz) on all 8 cores, and that was with automatic voltage. CPU temp was 56-57C.
The core/thread load was even but still under 65%, even on the second pass, meaning no improvement in encode time over 3.6 GHz.
Dropped it back to 3.7 GHz and called it good. Double the speed of my old FX Black will do nicely.
[Attachment: Ryzen 4GHz encode.JPG]
As to H264 vs x264: I use an x264 GUI that exposes far more settings than VRD does for H264. Some of the x264 settings I use give better quality than VRD, but mean significantly longer encode times. Any meaningful comparison would have to use comparable settings for both encoders.
 

jmc

Active member
Finally got the AM4 bracket for my Hyper 212 cooler. Tried some quick OC; I don't do gaming, but wanted to see the effect on encode times.
No problem hitting 3991.9 MHz (effectively 4 GHz) on all 8 cores, and that was with automatic voltage. CPU temp was 56-57C.
The core/thread load was even but still under 65%, even on the second pass, meaning no improvement in encode time over 3.6 GHz.
Dropped it back to 3.7 GHz and called it good. Double the speed of my old FX Black will do nicely.
[Attachment 2239: Ryzen 4GHz encode screenshot]
As to H264 vs x264: I use an x264 GUI that exposes far more settings than VRD does for H264. Some of the x264 settings I use give better quality than VRD, but mean significantly longer encode times. Any meaningful comparison would have to use comparable settings for both encoders.
Heh, I'm also trying to get my "free" AM4 bracket for my newly ordered Hyper 212.
Cooler Master was out of them as of yesterday.
I have to click through from my Newegg order form to get the "free" price.

Your speeds and temps look great! Hope my 1700 does as well.
Hoping for 3.8 GHz, plus or minus, depending on temperature and voltage.
They say anything over 1.35 V will shorten lifespan, with a max of 1.45 V.

If I get usage in the 60% range I'll plan on running two encodes at once,
like I used to with H264. That keeps my CPU in the 90% range.
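For anyone scripting encodes outside VRD, the two-at-once trick is easy to automate. A minimal Python sketch, assuming command-line encoder tools; the ffmpeg command lines in the comment are illustrative assumptions, not VRD's actual invocation:

```python
import subprocess

def run_parallel(commands):
    """Start all encode commands at once, then wait for each to finish.

    Returns the exit codes in the same order the commands were given.
    Running two encodes concurrently helps keep total CPU use high when
    a single encode only loads the cores to ~50-60%.
    """
    procs = [subprocess.Popen(cmd) for cmd in commands]
    return [p.wait() for p in procs]

# Hypothetical usage with ffmpeg (filenames and settings are assumptions):
# run_parallel([
#     ["ffmpeg", "-i", "show1.mpg", "-c:v", "libx264", "-preset", "medium", "out1.mp4"],
#     ["ffmpeg", "-i", "show2.mpg", "-c:v", "libx264", "-preset", "medium", "out2.mp4"],
# ])
```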

Trying to remember some of the things that limit my CPU use.
I think high-bitrate HD sources drop the % use versus DVD, and using the
L3 preset also lowers the % CPU usage.

x264...
I've got VRD Pro, so I can just change the encoder from H264 (the default) to x264
in a profile to test with, and I get more than twice the fps!
Hope that happens with Ryzen.

Don't know if you have tried the Level 3 preset in Advanced (Level 2 is the default).
Don't know everything it turns on, but my L3 fps is half of L2's.

Standard DVD MPG to 1.3 Mbps x264 MP4 files: L2 preset 230 fps, L3 preset 120 fps.
(6-core Intel 3930 @ 4.3 GHz)
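VRD's L2/L3 presets aren't the same thing as standalone x264's preset ladder, but the speed-versus-quality trade-off is analogous. For anyone wanting to reproduce this kind of preset comparison outside VRD, here's a sketch that builds matched ffmpeg/libx264 command lines; the filenames and CRF value are assumptions:

```python
def x264_cmd(infile, outfile, preset, crf=20):
    """Build an ffmpeg command for a libx264 encode at the given preset.

    Constant-quality (CRF) mode keeps output quality roughly comparable
    across presets, so any fps difference reflects the preset's extra work.
    """
    return ["ffmpeg", "-i", infile,
            "-c:v", "libx264", "-preset", preset, "-crf", str(crf),
            "-c:a", "copy", outfile]

# Two runs differing only in preset, e.g. to time "medium" vs "slower":
fast_cmd = x264_cmd("in.mpg", "out_medium.mp4", "medium")
slow_cmd = x264_cmd("in.mpg", "out_slower.mp4", "slower")
```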

jmc
 

Otter

Member
RYZEN 9 update:

Just got my Ryzen 9 3900X up and running. The CPU swap only took about 15 minutes to pull the Gen1 1800X and install the Gen3 3900X. Getting the BIOS right took much longer: I had to back the Gigabyte X470 BIOS up two releases, run a firmware update patch for 4-DIMM-slot compatibility, then flash the next level, and only then the latest release, F31b. I expect Gigabyte will have more BIOS releases coming as they further tweak for Gen3.

Reran an MKV/h264 input project I ran yesterday on the 1800X 8c/16t. Same software encode settings, same MoBo, same memory, Samsung EVO 512GB NVMe to EVO 1TB SSD.

Results with VRD 5-771a were as expected going from the 1800X 8c/16t to the 3900X 12c/24t: 54% higher encoding fps, and total time was cut by a third.
All 12 physical cores went to 4.1 GHz and workload was about 75% on each. The 12 hyperthreads were also all active and evenly loaded, but at about 45%.

Was hoping for more based on AMD's claims for the new Gen3 and its IPC improvements, but 54% is nothing to sneeze at.
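The two numbers in that result are self-consistent; a quick back-of-envelope check (the +54% figure is from the measurement above):

```python
old_cores, new_cores = 8, 12

# Ideal speedup from the core count alone, ignoring IPC and clock changes.
ideal_gain = new_cores / old_cores - 1          # 0.50 -> +50%

# Measured: +54% fps, i.e. each frame takes 1/1.54 as long,
# so total encode time drops by about a third.
measured_gain = 0.54
time_saved = 1 - 1 / (1 + measured_gain)        # ~0.35

print(f"core-count-only gain: {ideal_gain:.0%}")
print(f"time saved at +54% fps: {time_saved:.0%}")
```

So +54% fps is slightly better than pure core scaling would predict, and "time cut by a third" is exactly what +54% fps implies.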
 

jmc

Active member
Good to see some results. I knew the 3rd gen was so much better that it was time to upgrade.
I've heard that your 3900X will beat my first-gen 16-core Threadripper!

"Results with VRD 5-771a": would love to see some results with the beta VRD6 x264 encoder.

My Gigabyte X370 motherboard with a Ryzen 1700 is listed as OK for the 12-core 3900X (105 watts), but I'm hoping
the BIOS will be updated to handle the 16-core 3950X this September.
They are both 105 watts... don't know how that happens.

It was worthwhile going to the trouble of overclocking my first-gen Threadripper, but thankfully everything
I've been reading says that unless you just really enjoy the tweaking itself, don't bother overclocking the 3rd-gen Ryzens.
You won't get much at all; a bit up here, a bit down there.
I'm glad: I'll hopefully just pop in the 16-core 3950X this fall and use it.

Now, the "built-in" Ryzen boosting does seem to be helped by keeping the chip cooler,
but it won't be anything big.

I'll be on my huge Noctua with a carbon fiber pad (within 2-3C of "top end" paste, per der8auer).
My "Carbonaut" pad was 5C cooler than the Arctic silicone pad I was using: 67C down to 63C! Fans much quieter!

thanks,
jmc
 

Otter

Member
My Gigabyte X370 motherboard with a Ryzen 1700 is listed as OK for the 12-core 3900X (105 watts), but I'm hoping
the BIOS will be updated to handle the 16-core 3950X this September.
They are both 105 watts... don't know how that happens.
My 3900X is basically the same as the upcoming 3950X you covet. Both have two 8-core chiplets under the hood. Especially on a new process shrink, many dies come out with at least one core not performing to spec.
AMD turns those core sections off and "bins" the parts as 12c/24t. When AMD stockpiles enough fully working dies for the 16c/32t part (September), it will release the 3950X and not have to allocate or back-order.

As to the 105 W TDP, there is no way 16 cores running 32 threads can use the same amount of power as 12c/24t AND run at the same speed. Yes, the "Max Boost Clock" for the 3900X is 4.6 GHz vs the 3950X at 4.7 GHz, but you will not get all cores running at that speed at the same time on either processor.
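One way to see why the shared 105 W rating forces lower all-core clocks on the 16-core part: the per-core power budget shrinks. A trivial illustration (TDP figures from AMD's published specs; the even split is a simplification, since TDP is a rating, not exact power draw):

```python
tdp_watts = 105.0  # same rated TDP for both SKUs

# More cores sharing the same thermal/power envelope means
# less budget per core, hence lower sustained all-core clocks.
for name, cores in [("3900X", 12), ("3950X", 16)]:
    per_core = tdp_watts / cores
    print(f"{name}: {per_core:.2f} W per core under all-core load")
```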

Depending on how good your cooling is, as the CPU heats up, cores will be throttled to lower speeds to avoid melting your expensive toy. If your program gets all 16 cores going, they will generate more heat and have to run at a lower clock than the 3900X's 12 cores.
I suspect thermal load will mean 16 cores all running at a constant 4.1 GHz during a VRD encode vs 12 cores all at 4.3 GHz (what mine runs fully loaded using a Hyper 212 EVO).
That will be a question for you to answer in Sept unless some tech site runs comparisons sooner.

As nothing I do except video work requires 12 or 16 cores, I couldn't justify the wait or the extra expense of the coming 3950X.

Not sure which X370 board you have, but the AX370 Gaming supports Ryzen 9, including the 3900X, from the F40 BIOS on (the CPU support list just hasn't been updated yet).
DO follow the Gigabyte BIOS page instructions.
I was running F40 stable on my X470 board with the 1700X. BEFORE I installed the 3900X, I had to drop back to the F31 BIOS, run the Gigabyte ECFwUpdate to "ensure 4 DIMM compatibility", then go to the F40 BIOS.
Then I also had to upgrade to the F41a BIOS, which fixes a drive-not-found issue booting from my Samsung 970 EVO NVMe/M.2 drive.
 

jmc

Active member
I suspect thermal load will mean 16 cores all running at a constant 4.1 GHz during a VRD encode vs 12 cores all at 4.3 GHz (what mine runs fully loaded using a Hyper 212 EVO).
That will be a question for you to answer in Sept unless some tech site runs comparisons sooner.

As nothing I do requires 12 OR 16 cores except video work, I couldn't justify the wait or the extra expense of the coming 3950X
"Hyper 212 EVO": that's perfect, as that is what's on my 1700 @ 3.9 GHz.

Your 12-core should handle just about whatever you throw at it.
I never see 100% on my 16 cores except in a benchmark. x264 seems to use 10-11 cores for me.

I'm trying to enhance (x264) some digitized tapes (faded, fuzzy, washed out), and
I tried piling on filters (color, contrast, "contour", precision deinterlace, VerySlow preset) to see what I could get out of my CPUs (even some 4K/60 HEVC on the 16-core).

My 6-core/12-thread 3930 stayed at 98-99% (never seen that before)... 39 fps.
(Just a "normal" x264 encode would be 80-ish% and around 120-130 fps.)

My 16-core/16-thread (no "HT" on) with x264 and the heavy load would run 80% +/-... 62 fps, a 59% increase.

I did test HEVC and was amazed: 25-30% CPU is all it uses on the bottom half of the presets.
Six cores or sixteen, 4-5 cores is it for HEVC at the "medium" preset and slower.
Piling on all the extras above only made the encode take 10x longer... still only 25-30%.
(The "UltraFast" preset did use around 8 threads on a simple MPG encode.)

I just want the 16 cores so I can encode video and still have around 4 cores for "me": web pages, hardware YouTubes, etc.
I also record video on the same box, and that can be really touchy. Hate having to patch out "glitches".

EDIT: Ryzen 3000 temperature & speed effects:
https://www.gamersnexus.net/news-pc/3492-ryzen-cpu-thermals-matter-coolers-and-cases

Thanks for your thoughts!
jmc
 