Mining with AMD Ryzen 3 1200 Quad-Core Processor ...
Best mining CPU 2020: the best processors for mining ...
GameMax GM-1650 1650W 80 Plus Gold Power Supply with 14cm Fan - Black £85.80 at Amazon
The description of this deal was not provided by this subreddit or its contributors.

£85.80 - CCLOnline

Product Description: Game Max 12V ATX 1650W power supply, suitable for 8-GPU Bitcoin and Ethereum (ETH) mining rigs. Single independent +12V rail with power output up to 1650W and 80 Plus Gold conversion efficiency; typical load efficiency up to 90%. Large 140mm fan with intelligent temperature control for high airflow and near-silent operation. Mature APFC and double-forward design; fully RoHS compliant. Full-power input supports multiple high-power quad-core CPUs, up to 6 graphics cards, high-performance motherboards, and other devices.

Key features:

Efficiency exceeds 88% at 220V (Gold, 90% design basis)
Thermal control technology
Protection circuitry: OVP / OPP / SCP / OCP / UVP / OTP
100% burn-in at ambient temperatures up to 45 degrees
100% high-voltage test at AC 1.5kV, 10mA, 3 seconds
Standby power less than 1W; low power consumption, low ripple, and silent operation

Troubleshooting steps:
Once you have connected the mains power cable, make sure you switch the manual power button, located next to the power input, from 0 to 1.
Make sure the 4+4 pin 12V connector is plugged into either the 4-pin or 8-pin CPU power input on your motherboard, in addition to the main 20+4 pin connector. Whether it is 4-pin or 8-pin depends on your motherboard. Your system will appear not to work without this connected.
Check AC power input. Make sure the cord is firmly seated in the wall socket and in the power supply socket. Try a different cord.
Check DC power connections. Make sure the motherboard and disk drive power connectors are firmly seated and making good contact. Check for loose screws.
Check installed peripherals. Remove all boards and drives and retest the system. If it works, add items back one at a time until the system fails again. The last item added before the failure returns is likely defective.

Box contains: power supply, screws.

Product information: GameMax GM-1650, black, rated power 1536W continuous, dimensions 15 x 18 x 8.5 cm, weight 2.85 kg.
This deal can be found at hotukdeals via this link: https://ift.tt/3mpvluJ
Mistyped the title... This is going to be a simple guide to help any R1 owner upgrade and optimize their Alpha.
(In order of importance)

Storage Unit: HDD OUT, SSD IN. This is by far the easiest upgrade to make and the most effective. https://www.newegg.com/p/pl?N=100011693%20600038463 Any of those will work; it just needs to be 2.5-inch SATA. How to Replace Video

WIFI Card: This is like a $5-15 upgrade. Go find any Intel 7265NGW off eBay and replace your current WIFI card with it. If you don't want to buy used, then here. How to Replace Video

RAM: RAM prices have tanked because of Bitcoin mining, so this has become quite a cheap upgrade as well. I'd recommend 16GB just because why not, but if you're tight on cash, 8GB is fine. https://www.newegg.com/p/pl?N=100007609%20601190332%20601342186%20600000401&Order=BESTMATCH How to Replace Video

CPU: This required the most research. I'd recommend you look through this first. The wattage of the processor slot only ranges from 35W to 50W according to a developer of the Alpha (Source). The socket type is LGA 1150. If you're going cheap, the i5-4590T (35W) and i5-4690S (65W) are both great options. i5-4590T i5-4690S The i5-4690T (45W) is also great but is hard to find from a trustworthy source for a reasonable price. If you're willing to spend $100+, then easily the i7-4790T (45W). That is probably the best processor to put in the Alpha. All 45W will be used, giving you 3.9 GHz Turbo. The T series apparently runs best on the R1, according to this Reddit post. How to Replace Video

GPU: Coming Soon!

Maxed-out Alpha R1 specs: i7-4790T, 1TB Samsung SSD, 16GB DDR3, Nvidia GeForce GTX 860M. (Upgrading to anything better than that is pointless.)
Optimizing the Alpha R1
1st Completely wipe the computer
Just a good place to start, gets rid of Hivemind and other aging programs.
Anything in my current, fairly old (but water-cooled!), PC worth using in a new one?
I started building computers around the year 2000 and have never really done a complete build from scratch (for myself) after my first. I'd upgrade a part here and there, and over time everything has been replaced multiple times. However, due to an upgrade hiatus (it took me a LONG time to "beat" Skyrim :-P), I think I'm at the end of the road. I'm close to concluding that, for the second time in my life, it makes sense to do a fresh new build. I figured I'd run this past y'all first.

My next computer I'll use for both fun and work. On the fun side, it would ideally play modern games (particularly, I'm eyeing Elder Scrolls VI and Baldur's Gate III) on decent settings on my 34" widescreen monitor. Work-wise, it needs to be able to run multiple Docker containers and let me do other things (take notes in Notion, Google Docs, etc.) while on a CPU-crushing video call. The budget is $1,500.

Here is my current setup and thoughts on each component. Photos: https://imgur.com/a/xyM07dx

Things that may be useful:

Operating System: Windows 10 Professional (from upgrading from Windows 7... the DVD is hopefully somewhere)

PSU: Corsair TX850W - It has been trusty for the last eight years, but may not have the needed connectors for today's stuff.

Hard Drive: Crucial MX100 512 GB SATA SSD, 2.5-inch - No performance complaints (specs claim 6.0 Gb/s), although I'm running out of storage space.

Optical Drive: Pioneer DVD-RW - Do people still put these in new computers? I also have an external USB DVD drive I could use in a pinch.

Case: Chieftec Dragon Mid Tower - This old case is steel and heavy as shit, which is actually nice as my dogs and toddlers are unlikely to knock it over inadvertently. It has a window, which I like, although cable management is a massive pain in the ass. I'm not too fond of the door that covers the buttons and optical drive and lost it long ago.

Cooling: Custom water cooling setup - I water-cooled in 2002, overclocking my Athlon XP 1700+ from 1.4GHz to 2.5. It was awesome. The radiator and T-valve are the original gangsters. I'm on my fifth pump, my last three being the Swiftech MCP655-B, which I like. The current water block is some D-Tek unit for the old CPU socket. The radiator is an old Chevy Impala radiator (I think) that a guy I met on a 3DMark (now Futuremark) forum (jb2cool?) custom modified, making a shroud that houses two 120mm fans. I had to drill the shit out of my case to mount the thing in there. I'm very nostalgic about this setup, but it would also be a huge pain to fit into a new case.

Monitor: LG 34UM67-P - 34" IPS widescreen; 5ms, 2560 x 1080, 60Hz. Is 60Hz too slow these days?

Keyboard and mouse: Logitech Cordless Wave - USB dongle; wrists feel OK, no complaints.

Things that probably will not be useful:

Motherboard: Gigabyte P45T-ES3G - I'm pretty sure I won't be reusing this. I bought it to replace a more bad-ass motherboard that died when my previous power supply failed and took it out with it. I do like that it had dual BIOS, though.

CPU: Intel Core 2 Quad Q6600 - Been impressed with this CPU lasting as long as it has. I wet-sanded it down to a mirror finish, ready to overclock the shit out of it, but never got to it as life got in the way.

Memory: 4x4GB PC3-12800 DDR3 - G.Skill Ripjaws; ancient technology. Note: I want more than 16GB of RAM in my next build.

GPU: Asus GeForce GTX 460 - My previous GTX 460 died at the height of Bitcoin, when any modern GPU was stupidly expensive. Replacing mine was only $30 on eBay, so that's the route I went.

tl;dr: are any of the above components still worthwhile in a modern PC build?
OSRS currently uses a CPU renderer straight out of 2003
It's really REALLY bad! At least, by modern standards. It could not be more opposite to what modern computers pursue. It's not Jagex's fault, it's just old... Very VERY old! Replacing it is a huge undertaking, and Jagex has been too busy knocking mobile absolutely out of the park - I'd do the same if I were them - so don't think this is some kind of rag on Jagex.

Anyways, some may be surprised that this renderer is still managing to hurt computers today. How can software first written in 2003-2004 (FOR COMPUTERS OF THAT ERA) be laggy and stuttery on computers today? The answer is simple: resizable mode, and individual CPU core speed.

Resizable mode takes a game window that used to be 765x503 (the majority of which used to be a fixed GUI canvas, but not with the new mode!) and renders it at resolutions as high as 3840x2160, maybe even higher. Do you know how many pixels that is? Over 8 million. Do you know how many pixels the original renderer was designed to expect? Just under 390,000. That's over 21x the work being thrown at modern CPUs. Cores aren't anywhere near 21x faster than they were at the close of the single-core era, which is why players with 4k monitors need to see therapists after long play sessions.

Surely CPUs have gotten faster since the mid 2000s! They have, but not quite in the way that a single-threaded (single core) CPU renderer would expect... CPU manufacturers have been focusing on power draw, temperatures, core count, and special architectural improvements like GPU integration and controller integration. Comparatively, improving individual core speed hasn't been as much of a focus as it was prior to the multi-core era - and no, I'm not talking about the useless gigahertz(TM) meme measurement, I'm talking about actual overall work done by the core. As a result, the CPUs we have today have developed down a much different path than what this CPU renderer would benefit from - per-core speed hasn't grown anywhere near the amount that resizable mode demands, especially considering these CPU cores were designed on the assumption that software wouldn't pile all its work onto just one core. We're throwing over 21x the work at CPUs that, in most cases, have only been getting 5-15% faster per-core performance every year.
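The pixel arithmetic behind that claim is easy to check; a quick sketch using the figures above:

```python
# Per-frame pixel counts from the post: fixed mode vs. a 4K resizable window.
fixed_pixels = 765 * 503      # original fixed-mode client: 384,795 pixels
uhd_pixels = 3840 * 2160      # 4K resizable mode: 8,294,400 pixels

ratio = uhd_pixels / fixed_pixels
print(f"{ratio:.1f}x the per-frame work")   # ~21.6x, i.e. "over 21x"
```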
What is a "frame"?
Think of a frame as a painting. Your GPU renderer (or CPU, cough cough) is responsible for using your GPU to paint an empty canvas and turn it into a beautiful, complete picture. First, it draws the skybox (if there is one; in OSRS's case it just fills with black). Then, it draws all the visible geometry from back to front, with all the lighting and effects. Then, it draws the GUI elements over the top. It does everything, one pixel at a time. Its job is to draw these paintings as quickly as possible (ideally, so you perceive movement) and present them to your monitor, one at a time, forever... until you close the game. Think of a GPU renderer as a talented artist with hundreds of arms (GPU cores).

If your GPU is able to paint this picture in 16.6 milliseconds (frame time measurements are always in milliseconds), then you'll have a frame rate of 60 frames per second, as 1000 ms / 16.6 ms is about 60. Sometimes your renderer struggles, though. Sometimes it can only complete a frame in 100 milliseconds (10 FPS). You can't wave a magic wand when this happens. If you want a higher framerate, you need to either upgrade your hardware or change your software. By change software, I mean either make it more efficient at the work it's told to do, or give it less work. RuneLite has done the former. An example of the latter would be lowering resolution, turning graphical details down, turning off filtering, etc. Games usually call this set of controls the "graphics settings". Luckily, OSRS is so lightweight it will likely never need a graphics settings menu.

(Think of a CPU renderer as a painter with no artistic ability and, in the case of a quad core, four arms... but he's only allowed to paint with one, while the other three sit idle. Also, he has to constantly stop painting to return to his normal duties! No fun! The CPU is better off at its own desk, letting the GPU handle the painting.)
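The frame-time-to-FPS conversion the analogy relies on is just a reciprocal; a quick sketch:

```python
def fps_from_frame_time(frame_time_ms):
    """Frames per second implied by a per-frame paint time in milliseconds."""
    return 1000.0 / frame_time_ms

print(fps_from_frame_time(16.6))   # ~60 FPS
print(fps_from_frame_time(100.0))  # 10 FPS
```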
A GPU renderer improves frame rates
Not that this matters currently, as the game is capped at 50 FPS anyways... but it's still going to be huge for low-end systems, or high-end systems with high-res monitors. There's also the future, though... Once a GPU renderer is out, it could be possible that they someday uncap the framerate (which, according to Mod Atlas, only affects the character's camera, as all animations are 2 FPS anyways). I expect that an update like this will make fixed mode a solid 50 FPS on literally everything capable of executing the game. Fixed mode was already easy to run on everything except old netbooks and Windows Vista desktops, so this really wouldn't be a surprise.
A GPU renderer improves frame times
Frame times are just as important as frame rates. Your frame rate is how many frames are drawn over the course of a second. But, as described previously, each "painting" is done individually. Sometimes the painter takes longer to do something! What if there's a glowing projectile flying past the camera, or something else momentary that's intensive? The painter has to take the time to paint that, resulting in a handful of frames over the course of that second taking much more time than the others.

When your frame rate is high and frame times are consistent, this is perceived as incredibly smooth motion. Ideally, all of our frames are completed in the same amount of time, but this isn't the case. Sometimes "distractions" will come up and cause the painter to devote an extra 10-20ms to them before returning to the rest of the painting. In bad scenarios, this actually becomes visible, and is referred to as micro stutter. Having a dedicated GPU renderer doing the work ensures this is very uncommon. A GPU has hundreds or thousands of cores. If some get distracted, others reach out and pick up the workload. Everything is smooth, distributed, and uninterrupted.

You may recall Mod Atlas talking about frame times when he posted about his GPU renderer last year: https://twitter.com/JagexAtlas/status/868131325114552321 Notice the part where he says it takes 25+ms on the CPU, but only 4-5ms on the GPU! That's 200-250 frames per second, if the framerate were uncapped!

Also, side note: just because a frame is completed in 1ms doesn't always mean your framerate will be 1000 FPS. If your framerate is capped, then the painter will sit and wait after completing and presenting a frame until it's time to start painting again. This is why capping your framerate can be good for power usage, as demonstrated on mobile! Your GPU can't suck up your battery if it's asleep 90% of the time!
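The difference between a good average frame rate and good frame times can be made concrete with a toy sketch (the numbers here are made up for illustration):

```python
# Two one-second captures with identical average frame rates: one paints every
# frame in the same time, the other has two long "distraction" frames.
smooth = [16.6] * 60                     # every frame takes 16.6 ms
stutter = [13.0] * 58 + [100.0, 142.0]   # mostly fast, two visible spikes

for name, times in (("smooth", smooth), ("stutter", stutter)):
    avg_fps = 1000 * len(times) / sum(times)
    print(f"{name}: avg {avg_fps:.0f} FPS, worst frame {max(times):.0f} ms")
```

Both captures average about 60 FPS, but the second one's 100-142 ms frames are exactly the micro stutter described above.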
A GPU renderer is more efficient
Instead of piling all computational and graphical workloads onto one single CPU core (rest in peace, 8+ core users), a GPU renderer takes graphical work off the CPU and does it itself. I'd estimate the majority of all the work was graphical, so this will make a pretty noticeable difference in performance, especially on older systems. Before, having OSRS open while using other software would have a noticeable performance impact on everything, especially on older computers. Not anymore! CPUs will run cooler, software will run better, and your computer may even use less power overall, since GPUs are much better at efficient graphical work than CPUs are!
All computers are already equipped to run this very VERY well
Most of the computers we have today are designed with two things: a good GPU, and an okay CPU. This isn't 2003 anymore. GPUs have made their way into everything, and they're prioritized over CPUs. They're not used just for games anymore, entire operating systems rely on them not just for animations and graphical effects, but entire computing tasks. GPUs are responsible for everything from facial recognition to Bitcoin mining these days. Not having a good one in your computer will leave you with a pretty frustrating experience - which is why every manufacturer makes sure you have one. Now, thanks to RuneLite, these will no longer be sitting idle while your poor CPU burns itself alive.
This new GPU renderer will make OSRS run much better on low end systems
Low end systems are notorious for having garbage like Intel Atom or Celeron in them. Their GPU is alright, but the CPU is absolutely terrible. Using the GPU will give them a boost from 5-15FPS in fixed mode, to around 50. At least, assuming they were made after the GPGPU revolution around 2010.
This new GPU renderer will make OSRS run much better on high end systems
High end systems tend to have huge GPUs and huge monitors. Right now, your GPU is asleep while your 4k monitor brings the current CPU renderer to its knees, on the verge of committing sudoku. Letting your GPU take on all that work will make your big and beautiful monitor handle OSRS without lag or stutter.
This new GPU renderer will open the possibility of plugins that build on top of it
One that comes to mind is a 2x/3x/4x GUI scaler. Scaling things in a graphics API is much easier than scaling them in some convoluted custom CPU renderer that was first designed to run in Internet Explorer 5.
It's easier to customize graphical variables in a GPU renderer than it is a glitchy old CPU renderer
Want night time? Change the light intensity. Want cel-shaded comic book appearance for some stupid reason? It's easy. Want to hit 60FPS on a Raspberry Pi? Change your render distance to 2 tiles. Now that the graphical work has been offloaded to a graphics API that's been literally designed to easily modify these things, the sky is the limit. See my past posts on this topic:
Big round of applause for the RuneLite team, and Jagex for allowing them to continue development. Without RuneLite, OSRS would be half the game it is today. Here's to their continued success, with or without Jagex integrating their code into the main game!
Just wanted to let you guys know that I'm successfully running a (pruned) Bitcoin node + Tor on an $11.99 single-board computer (Rock Pi S). The SBC contains a Rockchip RK3308 quad Cortex-A35 64-bit processor, 512MB RAM, RJ45 Ethernet, and a USB2 port, and I'm using a 64GB SD card. It runs a version of Armbian (410MB free). There's a new version available that even gives you 480MB of usable RAM, but I'm waiting for Bitcoin Core 0.19 before upgrading.

To speed things up, I decided to run Bitcoin Core on a more powerful device to sync the whole blockchain to an external HDD. After that I made a copy and ran it in pruned mode to end up with the last 5GB of the blockchain. I copied the data to the SD card and ran it on the Rock Pi S. After verifying all blocks, it runs very smoothly. Uptime at the moment is 15 days. I guess you could run a full node as well if you put in a 512GB SD card.

The Rock Pi S was sold out, but if anybody is interested, they started selling a new batch of Rock Pi S v1.2 today.

Screenshot of resources being used
Bitcoin Core info

Around 1.5 GB is being transferred every day.

---

Some links and a short how-to for people that want to give it a try:
Set up UFW firewall:

sudo ufw default deny incoming
sudo ufw default allow outgoing
sudo ufw allow ssh   # allow ssh connections or else we won't be able to log in
sudo ufw allow 8333  # port 8333 is used for bitcoin nodes
sudo ufw allow 9051  # port 9051 is used for tor
sudo ufw logging on
sudo ufw enable
sudo ufw status
Add a user "satoshi" so you don't run Bitcoin Core as root:

sudo adduser satoshi --home /home/satoshi --disabled-login
sudo passwd satoshi            # set a password
sudo usermod -aG sudo satoshi  # add the user to the sudo group
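To end up with a ~5GB pruned datadir like the one described, a bitcoin.conf along these lines should work. This is a sketch, not the poster's actual configuration: the prune size, Tor proxy address, and cache values here are illustrative assumptions.

```ini
# Hypothetical bitcoin.conf for a small pruned node behind Tor.
prune=5000            # keep roughly the last 5 GB of block data (value in MiB)
proxy=127.0.0.1:9050  # send outbound traffic through the local Tor SOCKS port
listen=1
maxconnections=16     # keep bandwidth and memory modest on a 512MB board
dbcache=100           # small UTXO cache (MiB) to fit the Rock Pi S RAM
```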
Once again I am asking for your assistance. Years ago you helped me build my 1st PC. It is time to upgrade. Included pictures of battle station and other questions.
Hi everyone. My build is starting to show its age; the more I try to do, the more I see it. At the bottom of the post you can see my current build. What is your intended use for this build? The more details the better.
Video/Picture Editing (Adobe Suite)
Streaming, gaming (FPS/action first-person games)
killing time on Reddit, programming
If gaming, what kind of performance are you looking for? (Screen resolution, framerate, game settings)
Screen resolution: Was thinking 4K, but I saw a YouTube video that said 4K monitors are not worth it.
Refresh rate: Want the best available. But also know I am very ignorant. 240hz?
Game settings: Would like to get the best looking game possible.
What is your budget (ballpark is okay)?
$1800 + $1200 (stimulus check...maybe if it ever gets here) = $3000
I also need to buy new monitors. so budget needs to account for that.
The rules had a link to this post about future-proofing. Is that still the case? It is about 9 years old. Should I just build something every two years and buy cheap parts? That seems so wasteful.
I hate my current monitors. See pic for the reason. I need help picking a monitor. I do know that I want a widescreen. Just don't know if two or one is the dream setup. MUST WORK WITH MOUNT ARM
Coronavirus and its effect on the market: some said that prices would go down; others say they will go up. My hope is to have a prototype ready, and if there is a sale, I can snatch it up.
What does everyone do with their old parts once they upgrade? Do you make a server? Use your old graphics card to mine Bitcoin? I need ideas. Might just build my wifey a computer with my old parts.
I don't buy new laptops, and when I do, I try to get the most out of my graphics. Before you AMD ass lickers ban me, I like AMD. If I was going to build a PC it would at least have an AMD CPU and maybe a 5700XT. I bought myself an Alienware, 32GB RAM, i7 6th gen and GTX 1070 for £600. Why I N T E L and N V I D I A? Easy answer. A full AMD machine is S H I T. I can still see AMD fanboys saying "OMG FX were still good". NO! You can call me whatever you want but I like the better side. I liked Intel until this year, which is when AMD really took over. Anyways, I got sidetracked there. I only have one PC (a Bitcoin miner) with a GTX 1080 and an I N T E L Core 2 Quad (mining is GPU intensive but not CPU intensive), and AMD does a very bad job in mining. So I only use a laptop. I'm going to change my laptop in about 3 years, and it will be a 2-3 year old but still capable laptop. SO DOES THAT MEAN IT'S GOING TO AMD? Cause you know, AMD is the best. Here I'm going to disappoint you. I have to be on the go and I can't have a PC. I never said I like AMD LAPTOP CPUs. In 3 years I'm getting a laptop with a 6-core CPU. Sadly, AMD doesn't offer you 6 cores. BUT WAIT, 7NM!!! Nope, 12nm. BUT...... BUT IT'S CHEAPER. I don't care, the laptop is going to be cheap anyway. All higher-end mobile AMD CPUs have 4 cores. Here I want to start a discussion and petition to have 6/8-core mobile Ryzen CPUs. And you know, since AMD likes pushing, make 12-core mobile CPUs.
Curiosity/Motivation/Logic and why stablecoins are the future
From the Prohashing mining pool forums, at https://forums.prohashing.com/viewtopic.php?f=11&t=6428: ----------------------------- In my last post, I showed why my confidence in there being more than one more bubble is too low to justify remaining heavily invested in cryptocurrencies. In this article I want to expand upon that reasoning by talking a little bit about human factors that lead me to believe that stablecoins pose a great risk to traditional cryptocurrencies. Defining CML People differ in a number of ways, and they express all sorts of personality traits. However, in my interactions with people in all areas of life, I've noticed that one characteristic seems to differentiate people more than any other. I'll refer to this characteristic as "CML" throughout the rest of this post, as the best way I was able to describe it is a sequence of curiosity, motivation, and logic. People who exhibit this trait use those three steps to evaluate and act when faced with most situations, while people who do not exhibit this trait fail to do so. An overwhelming majority of people do not possess the "CML" trait and its absence increasingly hinders their abilities to understand and succeed in the world as technology and social structures become increasingly complex. Here are a few examples of common scenarios people face in life.
At lunch, a co-worker discusses a movie that he's seen and talks about some aspect of it that he really enjoyed. A high-CML person might search for the movie on Google to read a synopsis of it, or will watch the movie himself to learn more. Low-CML people will nod and politely respond to what the other person is saying, then give it no further thought.
An error message appears in an application a person uses often. A high-CML person might search for or ask someone for information about the error message to get at least a cursory understanding of what the problem could be, or to figure out the best way to avoid it without understanding it. A low-CML person will often state "I didn't go to college, so I can't understand this" or "it's beyond my capabilities" and will believe the error is unsolvable.
A person just got laid off from a job. A high-CML person might create a spreadsheet listing his options, which in addition to finding a new job would include different possibilities like traveling on savings, working less and cutting back on expenses, or starting a business. A low-CML person would likely start firing off resumes to companies immediately without considering all the options first, because that's what society told him to do.
A person is asked to go skydiving. A high-CML person would say "yes," or would ask questions about skydiving before declining, because high-CML people evaluate risks and try new experiences. A low-CML person would decline immediately because skydiving is perceived as dangerous and that's not what he does.
A person buys a new gadget. A high-CML person sets up the gadget so that it is configured correctly for their home. A low-CML person turns on the gadget without looking to see if there is an easier way to do it, and then wastes hours over the years performing complex routines over and over again to work around whatever is wrong for them with the default configuration.
Xthinner/Blocktorrent development status update -- Jan 12, 2018
Edit: Jan 12, 2019, not 2018. Xthinner is a new block propagation protocol which I have been working on. It takes advantage of LTOR to give about 99.6% compression for blocks, as long as all of the transactions in the block were previously transmitted. That's about 13 bits (1.6 bytes) per transaction. Xthinner is designed to be fault-tolerant, and to handle situations in which the sender and receiver's mempools are not well synchronized with gracefully degrading performance -- missing transactions or other decoding errors can be detected and corrected with one or (rarely) two additional round trips of communication. My expectation is that when it is finished, it will perform about 4x to 6x better than Compact Blocks and Xthin for block propagation. Relative to Graphene, I expect Xthinner to perform similarly under ideal circumstances (better than Graphene v1, slightly worse than Graphene v2), but much better under strenuous conditions (i.e. mempool desynchrony). The current development status of Xthinner is as follows:
Detailed informal writeup of the encoding scheme -- done 2018-09-29
Modify TxMemPool to allow iterating on a view sorted by TxId -- done 2018-11-26
Basic C++ segment encoder -- done 2018-11-26
Basic c++ segment decoder -- done 2018-11-26
Checksums for error detection -- done 2018-12-09
Serialization/deserialization -- done 2018-12-09
Prefilled transactions, coinbase handling, and non-mempool transactions -- done 2018-12-25
Missing/extra transactions, re-requests, and handling mempool desynchrony for segment decoding -- done 2019-01-12
Block transmission coupling the block header with one or more Xthinner segments -- 50% done 2019-01-12
Missing/extra transactions, re-requests, and handling mempool desynchrony for block decoding -- done 2019-01-12
Integration with Bitcoin ABC networking code
Networking testing on regtest/testnet/mainnet with real blocks
Write BIP/BUIP and formal spec
Bitcoin ABC pull request and begin of code review
Unit tests, performance tests, benchmarks -- started
Bitcoin Unlimited pull request and begin of code review
Alpha release of binaries for testing or low-security block relay networks
Merging code into ABC/BU, disabled-by-default
Complete security review
Enable by default in ABC and/or BU
(Optional) parallelize encoding/decoding of blocks
Following is the debugging output from a test run done with coherent sender/recipient mempools and a 1.25 million-tx block, edited for readability:
Testing Xthinner on a block with 1250003 transactions with sender mempool size 2500000 and recipient mempool size 2500000
Tx/Block creation took 262 sec, 104853 ns/tx (mempool)
CTOR block sorting took 2467 ms, 987 ns/tx (mempool)
Encoding is 1444761 pushBytes, 2889520 1-bit commands, 103770 checksum bytes
total 1910345 bytes, 12.23 bits/tx
Single-threaded encoding took 2924 ms, 1169 ns/tx (mempool)
Serialization/deserialization took 1089 ms, 435 ns/tx (mempool)
Single-threaded decoding took 1912314 usec, 764 ns/tx (mempool)
Filling missing slots and handling checksum errors took 0 rounds and 12 usec, 0 ns/tx (mempool)
Blocks match!
*** No errors detected
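The headline compression numbers in that run can be reproduced directly from the figures in the output; a quick sketch:

```python
# Figures from the test run above.
txs = 1_250_003
encoded_bytes = 1_910_345
avg_tx_bytes = 400                        # the post's assumed average tx size

block_bytes = txs * avg_tx_bytes          # ~500 MB uncompressed
bits_per_tx = encoded_bytes * 8 / txs
reduction = 1 - encoded_bytes / block_bytes

print(f"{bits_per_tx:.2f} bits/tx")       # 12.23, matching the output
print(f"{reduction:.3%} size reduction")  # ~99.618%
```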
If each transaction were 400 bytes on average, this block would be 500 MB, and it was encoded in 1.9 MB of data, a 99.618% reduction in size. Real-world performance is likely to be somewhat worse than this, as it's not likely that 100% of the block's transactions will always be in the recipient's mempool, but the performance reduction from mempool desynchrony is smooth and predictable. If the recipient is missing 10% of the sender's transactions, and has another 10% that the sender does not have, the transaction list can still be successfully transmitted and decoded, although in that case it usually takes 2.5 round trips to do so, and the overall compression ratio ends up being around 71% instead of 99.6%. Anybody who wishes can view the WIP Xthinner code here.

Once Xthinner is finished, I intend to start working on Blocktorrent. Blocktorrent is a method for breaking a block into small, independently verifiable chunks for transmission, where each chunk is about one IP packet (a bit less than 1500 bytes) in size. In the same way that BitTorrent was faster than Napster, Blocktorrent should be faster than Xthinner. Currently, one of the big limitations on block propagation performance is that a node cannot forward the first byte of a block until the last byte of the block has been received and completely validated. Blocktorrent will change that, and allow nodes to forward each IP packet shortly after that packet was received, regardless of whether any other packets have also been received and regardless of the order in which the packets arrive. This should dramatically improve the bandwidth utilization efficiency of nodes during block propagation, and should reduce the block propagation latency for reaching the full network quite a lot -- my current estimate is about a 10x improvement over Xthinner. Blocktorrent achieves this partial validation of small chunks by taking advantage of Bitcoin blocks' Merkle tree structure.
Chunks of transactions are transmitted in a packet along with enough of the Merkle tree's internal nodes to allow that chunk of transactions to be validated back to the Merkle root, the block header, and the mining PoW, thereby ensuring that the packet being forwarded is not invalid spam data used solely for a DoS attack. (Forwarding DoS attacks to other nodes is bad.) Each chunk will contain an Xthinner segment to encode its TXIDs. My performance target with Blocktorrent is to be able to propagate a 1 GB block in about 5-10 seconds to all nodes in the network that have 100 Mbps connectivity and quad-core CPUs. Blocktorrent will probably perform a bit worse than FIBRE at small block sizes, but better at very large block sizes, all without the trust and centralized infrastructure that FIBRE uses.
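The Merkle validation that Blocktorrent relies on can be sketched in a few lines. This is a minimal illustration using Bitcoin's standard double-SHA256 Merkle construction; the function names are mine, not taken from the Blocktorrent code:

```python
import hashlib

def dsha256(b: bytes) -> bytes:
    """Bitcoin's double SHA-256."""
    return hashlib.sha256(hashlib.sha256(b).digest()).digest()

def merkle_root(leaves):
    """Bitcoin-style Merkle root: duplicate the last node on odd-sized levels."""
    level = list(leaves)
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        level = [dsha256(level[i] + level[i + 1])
                 for i in range(0, len(level), 2)]
    return level[0]

def verify_branch(leaf: bytes, branch, index: int, root: bytes) -> bool:
    """Check one leaf (TXID) against the root via its Merkle branch.

    This is the kind of check that lets a node validate a ~1500-byte
    chunk before the rest of the block has arrived.
    """
    h = leaf
    for sibling in branch:
        h = dsha256(sibling + h) if index & 1 else dsha256(h + sibling)
        index >>= 1
    return h == root

# Tiny demo: a 4-transaction "block".
txids = [dsha256(bytes([i])) for i in range(4)]
root = merkle_root(txids)
branch_for_tx0 = [txids[1], dsha256(txids[2] + txids[3])]
assert verify_branch(txids[0], branch_for_tx0, 0, root)
```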
Geez, I'm middle aged now. Help me tweak my build.
I have a toddler now, and I can't completely re-educate myself for 2019 parts, so I'm hoping you all can help me figure out if/how I'm going wrong, or if I can get better value for money.
My rig from 2010 is on its last legs, and I'm looking to replace. (In case anyone's nostalgic, it's an i5-760//4GB//Radeon HD5800)
I don't want anyone to build this for me - I'm just looking for advice. My budget is in the 600-900 range. I'm not looking to max-out my budget, but I'd love to know if there are places where I can get better value for money. I'm in Connecticut. Keyboard, mouse, speakers, and monitor are separate. I'm fine on my own with that.
Use Case - probably pretty light. Productivity: I'd like to be able to either dual-monitor or use a 4k (not both at the same time). Gaming: Single monitor, 1080P, and not necessarily the latest and greatest. I'm a patient gamer, and considering my next game will be Axiom Verge or something N64-era. Overclocking is not expected. My toddler may end up using it for a while, but I'm sure he'll need a new one by the time he's 8 or so.
Am I wasting money with both a 2200G AND dedicated graphics? If so, will the CPU be enough for me, or ought I go with a different chip/card combo?
I kind of guessed at the Graphics card, based on the stickied builds and number of reviews. I'm happy to hear suggestions.
Do I need a separate cooler/heatsink? (Judging from the stickied builds, no?) Does the CPU come with a stock one?
I could save a few bucks with a 0.25TB SSD. Meh, I'll go with 0.5TB.
I could probably go for 8GB memory, but 16GB may make it last longer.
HDD reliability is pretty important to me. Any insights on manufacturer reputations are very welcome.
Boring cases are fine.
Thanks to all for your help!
Edit: Learning to format.
Edit 2: Guys, GUYS! There's been a lot of really good suggestions here. Thanks to everyone.
But we're not maxing out the budget for the sake of it. Check out the use case - or the title! I gave up on current-level graphics and FPS games some time ago. I'm not paying $250 for a graphics card (or competing with Bitcoin miners).
In fact, how far could I downgrade my graphics card and still hit my targets for desktop apps (and still be able to play much older games)? For games, let's target: "I could ably play Minecraft without gameplay problems, but the graphics might be mid-level."
Upgraded the processor - thanks, dar! The 2600X was only $10 extra so I went with that. Changed the storage solution to an SSD - thanks lild. Changed PSU - thanks lild. Removed Windows 10 - will look into that, but it's not something that needs to go into compatibility/performance discussions.
Technical Cryptonight Discussion: What about low-latency RAM (RLDRAM 3, QDR-IV, or HMC) + ASICs?
The Cryptonight algorithm is described as ASIC resistant, in particular because of one feature:
A megabyte of internal memory is almost unacceptable for the modern ASICs.
EDIT: Each instance of Cryptonight requires 2MB of RAM. Therefore, any Cryptonight multi-processor is required to have 2MB per instance. Since CPUs are incredibly well loaded with cache (i.e. 32MB L3 on Threadripper, 16MB L3 on Ryzen, and plenty of L2+L3 on Skylake servers), it seems unlikely that ASICs would be able to compete well vs CPUs. In fact, a large number of people seem to be incredibly confident in Cryptonight's ASIC resistance. And indeed, anyone who knows how standard DDR4 works knows that DDR4 is unacceptable for Cryptonight. GDDR5 similarly doesn't look like a very good technology for Cryptonight, since it focuses on high bandwidth instead of low latency. This suggests that only on-ASIC RAM would be able to handle the 2MB that Cryptonight uses. Solid argument, but it seems to be missing a critical point of analysis in my eyes. What about "exotic" RAM, like RLDRAM3? Or even QDR-IV?
QDR-IV SRAM is absurdly expensive. However, it's a good example of "exotic RAM" that is available on the marketplace. I'm focusing on it because QDR-IV is really simple to describe. QDR-IV costs roughly $290 for 16Mbit x 18 bits. It is true static RAM. The 18 bits are 8 bits per byte + 1 parity bit, because QDR-IV is usually designed for high-speed routers. QDR-IV has none of the speed or latency issues of DDR4 RAM. There are no "banks", there are no "refreshes", there is no "obliterate the data as you load it into the sense amplifiers". There's no "auto-precharge" as you load the data from the sense-amps back into the capacitors. Anything that could have caused latency issues is gone. QDR-IV is about as fast as you can get latency-wise. Every clock cycle, you specify an address, and QDR-IV will generate a response every clock cycle. In fact, QDR means "quad data rate", as the SRAM performs 2 reads and 2 writes per clock cycle. There is a slight amount of latency: 8 clock cycles for reads (7.5 nanoseconds), and 5 clock cycles for writes (4.6 nanoseconds). For those keeping track at home: AMD Zen's L3 cache has a latency of 40 clocks, aka 10 nanoseconds at 4GHz. Basically, QDR-IV BEATS the L3 latency of modern CPUs. And we haven't even begun to talk software or ASIC optimizations yet.
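The latency figures above line up arithmetically. A quick check (the ~1066 MHz QDR-IV clock is back-solved from the quoted "8 cycles = 7.5 ns", not stated in the post):

```python
# Inferred QDR-IV clock: 8 cycles / 7.5 ns ~= 1066 MHz (an assumption).
qdr_clock_hz = 1066e6

read_ns  = 8 / qdr_clock_hz * 1e9   # 8-cycle read latency  -> ~7.5 ns
write_ns = 5 / qdr_clock_hz * 1e9   # 5-cycle write latency -> ~4.7 ns
zen_l3_ns = 40 / 4e9 * 1e9          # 40 cycles at 4 GHz    -> 10 ns

print(round(read_ns, 1), round(write_ns, 1), round(zen_l3_ns, 1))
```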
CPU inefficiencies for Cryptonight
Now, if that weren't bad enough... CPUs have a few problems with the Cryptonight algorithm.
AMD Zen and Intel Skylake CPUs transfer from L3 -> L2 -> L1 cache. Each of these transfers are in 64-byte chunks. Cryptonight only uses 16 of these bytes. This means that 75% of L3 cache bandwidth is wasted on 48-bytes that would never be used per inner-loop of Cryptonight. An ASIC would transfer only 16-bytes at a time, instantly increasing the RAM's speed by 4-fold.
AES-NI instructions on Ryzen / Threadripper can only be done one-per-core. This means a 16-core Threadripper can at most perform 16 AES encryptions per clock tick. An ASIC can perform as many as you'd like, up to the speed of the RAM.
CPUs waste a ton of energy: there's L1 and L2 caches which do NOTHING in Cryptonight. There are floating-point units, memory controllers, and more. An ASIC which strips things out to only the bare necessities (basically: AES for Cryptonight core) would be way more power efficient, even at ancient 65nm or 90nm designs.
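The cache-line arithmetic behind the first point is easy to verify:

```python
cache_line = 64   # bytes moved per L3 -> L2 -> L1 transfer on Zen/Skylake
used = 16         # bytes Cryptonight's inner loop actually touches

wasted = 1 - used / cache_line      # 0.75: 75% of L3 bandwidth wasted
asic_speedup = cache_line // used   # 4x effective bandwidth for an ASIC
                                    # that fetches 16 bytes at a time
print(wasted, asic_speedup)
```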
QDR-IV and RLDRAM3 still have latency involved. Assuming 8 clocks of latency, the naive access pattern would be:
Cryptonight #1 Read
(RAM idles while waiting for the read to return)
Cryptonight #1 Write
(RAM idles while waiting for the write to complete)
Cryptonight #1 Read #2
...
This isn't very efficient: the RAM sits around waiting. Even with "latency reduced" RAM, you can see that the RAM still isn't doing very much. In fact, this is why people thought Cryptonight was safe against ASICs. But what if we instead ran four instances in parallel? That way, there is always data flowing.
Cryptonight #1 Read
Cryptonight #2 Read
Cryptonight #3 Read
Cryptonight #4 Read
Cryptonight #1 Write
Cryptonight #2 Write
Cryptonight #3 Write
Cryptonight #4 Write
Cryptonight #1 Read #2
Cryptonight #2 Read #2
Cryptonight #3 Read #2
Cryptonight #4 Read #2
Cryptonight #1 Write #2
Cryptonight #2 Write #2
Cryptonight #3 Write #2
Cryptonight #4 Write #2
Notice: we're doing 4x the Cryptonight in the same amount of time. Now imagine if the stalls were COMPLETELY gone. DDR4 CANNOT do this. And that's why most people thought ASICs were impossible for Cryptonight. Unfortunately, RLDRAM3 and QDR-IV can accomplish this kind of pipelining. In fact, that's what they were designed for.
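A toy cycle count makes the gain concrete. This uses the 8-cycle read / 5-cycle write latencies quoted earlier and is an illustration, not a hardware model; the post's "4x" counts the four instances finishing together, while in raw cycle terms the gain over a fully stalled naive loop is even larger:

```python
READ_LAT, WRITE_LAT = 8, 5   # cycles, QDR-IV-style figures from above
ITERS = 524_288              # Cryptonight inner-loop iterations
INSTANCES = 4

# Naive: one instance, the RAM idles for the full latency of every access.
naive_cycles = ITERS * (READ_LAT + WRITE_LAT)   # per instance

# Interleaved: one command issues per cycle, so the 4 instances complete
# 4 reads + 4 writes every 8 cycles; latency hides behind the other work.
interleaved_cycles = ITERS * 2 * INSTANCES      # all 4 instances combined

speedup = (INSTANCES * naive_cycles) / interleaved_cycles
print(f"~{speedup:.1f}x more hashes per unit time")
```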
As good as QDR-IV RAM is, it's way too expensive. RLDRAM3 is almost as fast, but is way more complicated to use and describe. Due to the lower cost of RLDRAM3, however, I'd assume any ASIC for CryptoNight would use RLDRAM3 instead of the simpler QDR-IV. RLDRAM3 32Mbit x 36 bits costs $180 at quantity 1, and would support up to 64 parallel Cryptonight instances (in contrast, an $800 AMD 1950X Threadripper supports 16 at best). Such a design would basically operate at the maximum speed of RLDRAM3. In the case of an x36-bit bus at 2133 MT/s, we're talking about 2133 MT/s / (burst length 4 x 4 reads/writes x 524288 inner-loop iterations) == 254 full Cryptonight hashes per second. 254 hashes per second sounds low, and it is. But we're talking about literally a two-chip design here: 1 chip for RAM, 1 chip for the ASIC/AES stuff. Such a design would consume no more than 5 Watts. If you were to replicate the ~5W design 60 times, you'd get 15240 Hash/second at 300 Watts.
Depending on cost calculations, going cheaper and "making more" might be a better idea. RLDRAM2 is widely available at only $32 per chip at 800 MT/s. Such a design would theoretically support 800 MT/s / (4 x 4 x 524288) == 95 Cryptonight hashes per second. The scary part: the RLDRAM2 chip there only uses 1W of power. Together, you get 5 Watts again as a reasonable power estimate. x60 would be 5700 Hashes/second at 300 Watts. Here's Micron's whitepaper on RLDRAM2: https://www.micron.com/~/media/documents/products/technical-note/dram/tn4902.pdf . RLDRAM3 is the same but denser, faster, and more power efficient.
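Both hash-rate figures follow from the same formula: transfer rate divided by (burst length x accesses per iteration x inner-loop count). Rounding per board first, as the post does, reproduces the farm totals exactly:

```python
INNER_LOOP = 524_288                   # Cryptonight inner-loop iterations
BURST, ACCESSES = 4, 4
denom = BURST * ACCESSES * INNER_LOOP  # 8,388,608 transfers per hash

rldram3_hs = 2133e6 / denom   # 2133 MT/s RLDRAM3 -> ~254 H/s per board
rldram2_hs = 800e6 / denom    # 800 MT/s RLDRAM2  -> ~95 H/s per board

boards = 60                   # ~5 W per board -> ~300 W total
print(round(rldram3_hs), round(rldram3_hs) * boards)   # 254, 15240
print(round(rldram2_hs), round(rldram2_hs) * boards)   # 95, 5700
```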
Hybrid Memory Cube
Hybrid Memory Cube is "stacked RAM" designed for low latency. As far as I can tell, Hybrid Memory Cube allows an insane amount of parallelism and pipelining. It'd be the future of an ASIC Cryptonight design. The relevance of Hybrid Memory Cube is more to a "Generation 2" or later design. In effect, it demonstrates that future designs can be lower-power and higher-speed.
The overall board design would be the ASIC, which would be a simple pipelined AES ASIC that talks with RLDRAM3 ($180) or RLDRAM2 ($30). It's hard for me to estimate an ASIC's cost without the right tools or design. But a multi-project wafer service like MOSIS offers "cheap" access to 14nm and 22nm nodes. Rumor is that this is roughly $100k per run for ~40 dies, suitable for research and development. Mass production would require further investment, but mass production at the ~65nm node is rumored to be in the single-digit millions of dollars, or maybe even just 6 figures or so. So realistically speaking: it'd take a ~$10 million investment + a talented engineer (or team of engineers) familiar with RLDRAM3, PCIe 3.0, ASIC design, AES, and Cryptonight to build an ASIC.
Current CPUs waste 75% of L3 bandwidth because they transfer 64-bytes per cache-line, but only use 16-bytes per inner-loop of CryptoNight.
Low-latency RAM exists for only $200 for ~128MB (aka: 64-parallel instances of 2MB Cryptonight). Such RAM has an estimated speed of 254 Hash/second (RLDRAM 3) or 95 Hash/second (Cheaper and older RLDRAM 2)
ASICs are therefore not going to be capital friendly: between the RAM costs, the ASIC investment, and the literally millions of dollars needed for mass production, this would be a project whose up-front cost per hash/sec is a lot higher than a CPU's.
HOWEVER, a Cryptonight ASIC seems possible. Furthermore, such a design would be grossly more power-efficient than any CPU. Though the capital investment is high, the rewards of mass-production and scalability are also high. Data-centers are power-limited, so any Cryptonight ASIC would be orders of magnitude lower-power than a CPU / GPU.
EDIT: Greater discussion throughout today has led me to napkin-math an FPGA + RLDRAM3 option. I estimated roughly ~$5000 (+/- 30%; it's a very crude estimate) for a machine that performs ~3500 hashes/second, on an unknown number of Watts (maybe 75 Watts?): $2000 FPGA, $2400 RLDRAM3, $600 on PCBs, misc chips, assembly, etc. A more serious effort may use Hybrid Memory Cube to achieve much higher FPGA-based hash rates. My current guess is that this is an overestimate of the cost, so -30% if you can achieve some bulk discounts, optimize the hypothetical design, and manage to accomplish it on cheaper hardware.
Vertnode - An automated solution for installing Vertcoin node(s) on Single Board Computers
Hello Vertcoin Community. Eager to contribute, I began creating step-by-step walkthrough guides on how to get a Vertcoin node up and running on a Raspberry Pi, Raspberry Pi Zero, and Intel NUC, along with optional steps to install p2pool-vtc. I decided that while a step-by-step guide might be helpful to a few, a setup script may prove useful to a wider range of people. I now have the script at a point where I think it may be productive to share it with a bigger audience. For those who are brave, have this hardware sitting around, or like to tinker with projects: I invite you to test this setup script. If you run into errors, any sort of verbose console output of the error is extremely helpful for troubleshooting. The script was designed to produce a "headless" server, meaning we will not be using a GUI to configure Vertcoin or check how things are running. In fact, once the server is set up, you will only interact with it using command line calls over SSH. The idea is to have this full node be simple, low-power, with optimized memory usage, and something that "just runs" in your basement, closet, etc. Why run a headless node on a Single Board Computer?
You want to support vertcoin. Running a node makes the network more robust and able to serve more wallets, more users, and more transactions.
You are building or using applications such as mining that must validate transactions according to vertcoin’s consensus rules.
You are developing vertcoin software and need to rely on a vertcoin node for programmable (API) access to the network and blockchain.
Required: USB Flash Drive 6GB - 32GB. Please note that the script was designed for Single Board Computers first and looks for an accessible USB Flash Drive to use for storing the blockchain and swap file, as constant writing to a microSD can degrade the health of the microSD.
Supports:
Raspberry Pi 3 B+ | ARM Cortex-A53 1.4GHz | 1GB SDRAM |
Raspberry Pi Zero (W) | Single Core ARMv6 1 Ghz | 433MB RAM |
All of the hardware listed above is hardware that I have personally tested / am testing on myself. The plan is to continue expanding my arsenal of single board computers and continue to add support for more hardware to ensure as much compatibility as possible. Functionality
Installs Vertcoin full node to Single Board Computer
Installs p2pool-vtc (Optional)
Installs LIT and LIT-AF (Optional)
It is worth noting that LIT can be run with multiple configurations; the ones displayed in the Post Installation Report reflect values that run LIT with the Vertcoin mainnet. Please be aware that the Vertcoin testnet chain has not been mined 100% of the time in the past; if you make transactions on the Vertcoin testnet that do not go through, it is likely because the chain has stopped being mined. BE CAREFUL WITH YOUR COINS, ONLY TEST WITH WHAT YOU ARE OKAY WITH LOSING IF YOU USE THE MAINNET.
Recommended: Use Etcher to install the chosen OS to your microSD card / USB flash drive.
If you intend to install Ubuntu Server 16.04 to your Intel NUC, please use Etcher to install the .iso to your USB flash drive. https://etcher.io/ PLEASE NOTE THIS SCRIPT MAY GIVE AN ERROR. THIS IS THE NATURE OF TESTING. PLEASE REPORT YOUR ERRORS IF YOU WANT THEM TO BE FIXED/RESOLVED. THANK YOU FOR BETTERING THE DEVELOPMENT OF THIS SCRIPT.
You can use different clients to SSH into your node. One option is PuTTY or Git Bash on Windows, which is included in the desktop version of Git. If you are using Linux you can simply open a new terminal window and SSH to the IP address of your node (the hardware you intend to install the Vertcoin node on). You will need to know the IP address of your node; it can be found on your router page.
ssh 192.168.1.5 -l pi
For example, this command uses ssh to log in to 192.168.1.5 using the -l login name of pi. The IP address of your node will likely be different for you; in this example I am logging into a Raspberry Pi, which has a default login name of pi. A brief list of commands that can be used to check on the Vertcoin node status:
vertcoin-cli getblockchaininfo | Grab information about your blockchain
vertcoin-cli getblockcount | Grab the current count of blocks on your node
vertcoin-cli getconnectioncount | Grab the current count of connections to your node. A number of connections larger than 8 means that you have incoming connections to your node. The default settings make 8 outgoing connections; if you want incoming connections, please port forward your Raspberry Pi in your router settings page.
vertcoin-cli getpeerinfo | Grab information about the peers you are connected to
vertcoin-cli getnettotals | Grab network data: how much has been downloaded/uploaded, displayed in bytes
tail -f ~/.vertcoin/debug.log | Output the latest lines in the Vertcoin debug.log to see verbose information about the Vertcoin daemon (ctrl+c to stop)
Thank you to all who have helped me and inspired me thus far: @b17z, @jamesl22, @vertcoinmarketingteam, @canen, @flakfired, @etang600, @BDF, @tucker178, @Xer0. This work is dedicated to the users of Vertcoin, thank you for making this possible. 7/20/2018 Thank you @CommodoreAmiga for the incredibly generous tip <3 You can reach me @Sam Sepiol#3396 on the Vertcoin Discord, here on reddit or @ [email protected]
Budget ~£400, but is flexible. If there is a good deal for more or less then great!
Open to buying used parts - I think retired bitcoin mining GPUs might be coming up for sale cheap now?
I'm not going to be doing any video editing or rendering or anything.
I think I want to be upgrading my GPU and SSD, but I'm not sure if this would mean that I need up update my mobo/psu/cooler too. If something else is the bottleneck here, or you need any other information, please let me know! My current setup looks like this: PCPartPicker Part List
But there might be a problem with resource usage... Let's say I own a lot of bitcoins and I do not want Ethereum to exist. So I'll run multiple high-performance, clustered nodes and use them to process transactions which will consume as many resources as possible. Soon running Ethereum nodes requires 1 TB of RAM. People say: "What the fuck? Clearly making scripts Turing-complete was a bad idea". And Ethereum is abandoned as a broken project... (Few people can afford to run full nodes, so it is as good as centralized.) This attack might cost many millions of USD, but if it helps to protect my Bitcoin investment, it makes sense.
Note that this was written before any details on Ethereum were settled, just general thoughts based on Ethereum's idea of running "Turing-complete scripts". So it looks like this kind of scenario is unfolding now, 2.5 years after I wrote that comment:
September 18, 2016: All geth nodes crash due to an out-of-memory bug. A specially crafted block makes geth, the most popular Ethereum node software, request huge amounts of RAM and crash. According to some reports, 85% of all Ethereum nodes were running geth at the time. All of them were crashing, and services (and wallets) which relied on them couldn't function.
September 22: "Today the network was attacked by a transaction spam attack that repeatedly called the EXTCODESIZE opcode (see trace sample here), thereby creating blocks that take up to ~20-60 seconds to validate due to the ~50,000 disk fetches needed to process the transaction. The result of this was a ~2-3x reduction in the rate of block creation while the attack was taking place; there was NO consensus failure". Ethereum blocks should normally appear each ~15 seconds, but they take ~20-60 seconds to validate. Thus a normal node just couldn't keep up with blocks. Thankfully, miners got slowed down too, so there was "NO consensus failure" this time.
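The quoted 20-60 second validation times are consistent with ~50,000 disk fetches per block at sub-millisecond to millisecond cost each. A rough sanity check (the per-fetch costs below are back-solved from the quoted range, not measured):

```python
fetches = 50_000   # disk fetches per attack block, per the quote above

# Assumed per-fetch costs bracketing the quoted 20-60 s validation times.
for per_fetch_ms in (0.4, 1.2):
    secs = fetches * per_fetch_ms / 1000
    print(f"{per_fetch_ms} ms/fetch -> {secs:.0f} s per block")
```

At a normal ~15-second block interval, even the low end means a node spends more time validating than the network spends producing blocks, which is why nodes fell behind.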
September 25: "attacker has changed strategy ... Basically, it's now a quadratic memory complexity attack but using CALL instead of EXTCODESIZE. However because the gas limit is only 1.5m, the effect is lower, so geth nodes are just running more slowly and not crashing outright. "
On my nodes, I'm seeing up to 16 GiB of virtual memory being used. This crashed one of my nodes twice, since it only had 8 GiB of RAM and 2 GiB of swap. I added more swap space, and that seems to have helped the crashing. I also changed the db cache size according to the blog post recommendations, and I'm now making it through the attack blocks in about 5 seconds on that machine. My other server has 16 GiB of RAM and a 4.4 GHz quad-core CPU, and it makes it through the attack blocks in about 2-3 seconds. Both have SSDs and are running Parity 1.3. With geth, some of these blocks take up to 2 minutes to verify.
So it seems like fairly decent server-class hardware is necessary to keep up with the Ethereum blockchain now, and that's if you run Parity, the heavily optimized Ethereum implementation. Ethereum devs try to mitigate the issue by recommending that miners increase transaction fees (gas price) and reduce block size (gas limit). This could hurt apps/users, if there were any. Now, this attack isn't going to kill Ethereum, of course. It's more like a warning. The cost of the attack is estimated to be on the scale of $5000 per day, so it's not some kind of large-scale attempt to kill Ethereum. I think things could be much worse if the attacker also had access to significant amounts of mining hashpower: this would have allowed him to mine huge blocks at zero cost. Also, Ethereum node hardware requirements might grow due to the demands of legitimate applications.
Console gaming is hardly different from PC gaming, and much of what people say about PC gaming to put it above console gaming is often wrong.
I’m not sure about you, but for the past few years, I’ve been hearing people go on and on about PCs’ “superiority” over the console market. People cite various reasons why they believe gaming on a PC is “objectively” better than console gaming, often related to power, costs, ease of use, and freedom. …Only problem: much of what they say is wrong. There are many misconceptions being thrown around about PC gaming vs console gaming that I believe need to be addressed. This isn’t about “PC gamers being wrong,” or “consoles being the best,” absolutely not. I just want to cut through some of the stuff people use to put down console gaming, and show that console gaming is incredibly similar to PC gaming. I mean, yes, this is coming from someone who mainly games on console, but I am also getting a new PC that I will game on as well, not to mention the 30 PC games I already own and play. I’m not particularly partial to one over the other. Now, I will mainly be focusing on the PlayStation side of the consoles, because I know it best, but much of what I say will apply to Xbox as well. Just because I don’t point out many specific Xbox examples doesn’t mean that they aren’t out there.
“PCs can use TVs and monitors.”
This one isn’t so much a misconception as it is the implication of one, and overall just… confusing. This appears in some articles and the pcmasterrace “why choose a PC” section, where they’re practically implying that consoles can’t do this. I mean, yes, as long as the ports of your PC match up with your screen(s’) inputs, you could plug a PC into either… but you could do the same with a console, again, as long as the ports match up. I’m guessing the idea here is that gaming monitors often use DisplayPort, as do most dedicated GPUs, and consoles are generally restricted to HDMI… But even so, monitors often have HDMI ports. In fact, PC Magazine has just released their list of the best gaming monitors of 2017, and every single one of them has an HDMI port. A PS4 can be plugged into these just as easily as a GTX 1080. I mean, even if the monitor/TV doesn’t have HDMI or AV to connect with your console, just use an adapter. If you have a PC with ports that don’t match your monitor/TV… use an adapter. I don’t know what the point of this argument is, but it’s made a worrying amount of times.
“On PC, you have a wide range of controller options, but on console you’re stuck with the standard controller."
Are you on PlayStation and wish you could use a specific type of controller that suits your favorite kind of gameplay? Despite what some may believe, you have just as many options as PC. Want to play fighting games with a classic arcade-style board, featuring the buttons and joystick? Here you go! Want to get serious about racing and get something more accurate and immersive than a controller? Got you covered. Absolutely crazy about flying games and, like the racers, want something better than a controller? Enjoy! Want Wii-style motion controls? Been around since the PS3. If you prefer the form factor of the Xbox One controller but you own a PS4, Hori’s got you covered. And of course, if keyboard and mouse it what keeps you on PC, there’s a PlayStation compatible solution for that. Want to use the keyboard and mouse that you already own? Where there’s a will, there’s a way. Of course, these aren’t isolated examples, there are plenty of options for each of these kind of controllers. You don’t have to be on PC to enjoy alternate controllers.
“On PC you could use Steam Link to play anywhere in your house and share games with others.”
PS4 Remote play app on PC/Mac, PSTV, and PS Vita. PS Family Sharing. Using the same PSN account on multiple PS4s/Xbox Ones and PS3s/360s, or using multiple accounts on the same console. In fact, if multiple users are on the same PS4, only one has to buy the game for both users to play it on that one PS4. On top of that, only one of them has to have PS Plus for both to play online (if the one with PS Plus registers the PS4 as their main system). PS4 Share Play; if two people on separate PS4s want to play a game together that only one of them owns, they can join a Party and the owner of the game can have their friend play with them in the game. Need I say more?
“Gaming is more expensive on console.”
Part one, the Software. This is one that I find… genuinely surprising. There’s been a few times I’ve mentioned that part of the reason I chose a PS4 is for budget gaming, only to be told that “games are cheaper on Steam.” To be fair, there are a few games on PSN/XBL that are more expensive than they are on Steam, so I can see how someone could believe this… but apparently they forgot about disks. Dirt Rally, a hardcore racing sim game, is… still $60 on all 3 platforms digitally… even though its successor is out.
See my point? Often times the game is cheaper on console because of the disk alternative that’s available for practically every console-available game. Even when the game is brand new. Dirt 4 - Remember that Dirt Rally successor I mentioned?
Yes, you could either buy this relatively new game digitally for $60, or just pick up the disk for a discounted price. And again, this is for a game that came out 2 months ago, and even its predecessor’s digital cost is locked at $60. Of course, I’m not going to ignore the fact that Dirt 4 is currently (as of writing this) discounted on Steam, but on PSN it also happens to be discounted for about the same amount.
Part 2: the Subscription. Now… let’s not ignore the elephant in the room: PS Plus and Xbox Gold. These would be ignorable if they weren’t required for online play (on the PlayStation side, it’s only required for PS4, but still). So yes, it’s still something that will be included in the cost of your PS4 or Xbox One/360, assuming you play online. Bummer, right? Here’s the thing: although you have to factor this $60 cost in with your console, you can make it balance out, at worst, and make it work out for you as a budget gamer, at best. As nice as it would be to not have to deal with the price if you don’t want to, it’s not a problem if you use it correctly. Imagine going to a new restaurant. This restaurant has some meals that you can’t get anywhere else, and fair prices compared to competitors. Only problem: you have to pay a membership fee to have the sides. You can have the main course, sit down and enjoy your steak or pasta, but if you want a side to make a full meal, you have to pay an annual fee. Sounds shitty, right? But here’s the thing: not only does this membership allow you to have sides with your meal, but it also allows you to eat two meals for free every month, and gives you exclusive discounts on other meals, drinks, and desserts. Let’s look at PS Plus for a minute: for $60 per year, you get:
2 free PS4 games, every month
2 free PS3 games, every month
1 PS4/PS3 and Vita compatible game, and 1 Vita-only game, every month
Exclusive/Extended discounts, especially during the weekly/seasonal sales (though you don’t need PS Plus to get sales, PS Plus members get to enjoy the best sales)
access to online multiplayer
So yes, you’re paying extra because of that membership, but what you get with that deal pays for it and then some. In fact, let’s ignore the discounts for a minute: you get 24 free PS4 games, 24 free PS3 games, and 12 Vita-only + 12 Vita-compatible games, up to 72 free games every year. Even if you only own one of these consoles, that’s still 24 free games a year. Sure, maybe you get games for the month that you don’t like; then just wait until next month. In fact, let’s look at Just Cause 3 again. It was free for PS Plus members in August, which is a pretty big deal. Why is this significant? Because it’s, again, a $60 digital game. That means with this one download, you’ve balanced out your $60 annual fee. Meaning? Every free game after that is money saved, every discount after that is money saved. And this is a trend: every year, PS Plus will release a game that balances out the entire service cost, then another 23 more that only add icing to that budget cake. Though, you could just count games as paying off PS Plus until you hit $60 in savings, but still. All in all, PS Plus, and Xbox Gold which offers similar options, saves you money. On top of that, again, you don’t need these memberships to get discounts, but with them, you get more discounts. Now, I’ve seen a few Steam games go up for free for a week, but what about being free for an entire month? Not to mention that, even if you want to talk about Steam Summer Sales, what about the PSN summer sale, or again, disc sale discounts? Now, a lot of research and math would be needed to see if every console gamer would save money compared to every Steam gamer for the same games, but at the very least? The costs will balance out, at worst.
Part 3, the Systems
Xbox and PS2: $299
Xbox 360 and PS3: $299 and $499, respectively
Xbox One and PS4: $499 and $399, respectively.
Rounded up a few dollars, that’s $1,000 - $1,300 in day-one consoles, just to keep up with the games! Crazy, right? So-called budget systems, such a rip-off. Well, keep in mind that the generations here aren’t short. The 6th generation, from the launch of the PS2 to the launch of the next generation consoles, lasted 5 years, or 6 based on the launch of the PS3 (though you could say it was 9 or 14, since the Xbox wasn’t discontinued until 2009, and the PS2 was supported all the way to 2014, a year after the PS4 was released). The 7th gen lasted 7 - 8 years, again depending on whether you count from the launch of the Xbox 360 or the PS3. The 8th gen so far has lasted 4 years. That’s 17 years that the console money is spread over. If you had a Netflix subscription at its original $8 monthly plan for that amount of time, that would be over $1,600 total. And let’s be fair here: just like you could upgrade your PC hardware whenever you wanted, you didn’t have to get a console at launch. Let’s look at PlayStation again, for example: In 2002, only two years after its release, the PS2 retail price was cut from $300 to $200. The PS3 Slim, released 3 years after the original, was $300, $100-$200 lower than the retail cost. The PS4? You could’ve either gotten the Uncharted bundle for $350, or one of the PS4 Slim bundles for $250. This all brings it down to $750 - $850, which again, is spread over a decade and a half. This isn’t even counting used consoles, sales, or the further price cuts that I didn’t mention. Even if that still sounds like a lot of money to you, even if you’re laughing at the thought of buying new systems every several years because your PC “is never obsolete,” tell me: how many parts have you changed out in your PC over the years? How many GPUs have you been through? CPUs? Motherboards? RAM sticks, monitors, keyboards, mice, CPU coolers, hard drives? It adds up. You don’t need to replace your entire system to spend a lot of money on hardware.
Even if you weren’t upgrading for the sake of upgrading, I’d be amazed if the hardware you’ve been pushing by gaming lasted even a third of that 17-year period. Computer parts aren’t designed to last forever, and they really won’t when you’re pushing them with intensive gaming for hours upon hours. Generally speaking, your components might last you 6 - 8 years, if you’ve got the high-end stuff. But let’s assume you bought a system 17 years ago that was a beast for its time, something so powerful that even though its parts have degraded over time, it’s still going strong. Problem is: you will have to upgrade something eventually. Even if you’ve managed to get this far into the gaming realm with the same 17-year-old hardware, I’m betting you didn’t do it with a 17-year-old operating system. How much did Windows 7 cost you? Or 8.1? Or 10? Oh, and don’t think you can skirt the cost by getting a pre-built system; the cost of Windows is embedded into the cost of the machine (why else would Microsoft allow their OS to go on so many machines?). Sure, Windows 10 was a free upgrade for a year, but that’s only half of its lifetime: you can’t get it for free now, and you couldn’t for the past year. On top of that, the free period was an upgrade; you had to pay for 7 or 8 first anyway. Point is, as much as one would like to say that they didn’t need to buy a new system every so often for the sake of gaming, that doesn’t mean they haven’t been paying for hardware, and even if they’ve only gotten into PC gaming recently, they’ll be spending money on hardware soon enough.
“PC is leading the VR—“
Let me stop you right there. If you added together the total number of Oculus Rifts and HTC Vives sold to this day, and threw in another 100,000 just for the sake of it, that number would still be under the number of PSVR headsets sold. Why could this possibly be? Well, for a simple reason: affordability. The systems needed to run the PC headsets cost $800+, and the headsets are $500 - $600, when discounted. PSVR, on the other hand, costs $450 for the full bundle (headset, camera, and Move controllers, with a demo disc thrown in), and can be played on either a $250 - $300 console or a $400 console, the latter recommended. Even if you want to say that the Vive and Rift are more refined, a full PSVR set, system and all, could cost just over $100 more than a Vive headset alone. If anything, PC isn’t leading the VR gaming market; the PS4 is. It’s the system bringing VR to the most consumers, showing them what the future of gaming could look like. Not to mention that as the PlayStation line grows more powerful (4.2 TFLOP PS4 Pro, 10 TFLOP “PS5…”), it won’t be long until it can run the same VR games as PC. Either way, this shows that there is a console equivalent to the PC VR options. Sure, there are some games you'd only be able to play on PC, but there are also some games you'd only be able to play on PSVR. …Though to be fair, if we’re talking about VR in general, these headsets don’t even hold a candle to, surprisingly, Gear VR.
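The affordability argument is simple arithmetic; a quick sketch using the round USD figures quoted above:

```python
# Entry cost of PC VR vs. PSVR, using the round USD figures quoted above.

pc_vr_entry = 800 + 500   # VR-capable PC + discounted Rift/Vive headset
psvr_entry  = 450 + 250   # full PSVR bundle + cheapest PS4 quoted

print(pc_vr_entry)        # 1300
print(psvr_entry)         # 700
# A full PSVR setup vs. a $600 (discounted) Vive headset alone:
print(psvr_entry - 600)   # 100 -> roughly the "just over $100 more" claim
```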
“If it wasn’t for consoles holding devs back, then they would be able to make higher quality games.”
This one is based on the idea that consoles are so “low spec” that when a developer has to keep them in mind, they can’t design the game to be nearly as good as it would be otherwise. I mean, have you ever seen the minimum specs for games on Steam? GTA V
Actually, bump up all the memory requirements to 8 GB, and those are some decent specs, relatively speaking. And keep in mind these are the minimum specs just to open the games. It’s almost as if the devs didn’t worry about console specs when making a PC version of the game, because this version of the game isn’t on console. Or maybe the consoles aren’t holding the games back that much because they’re not that weak. Just a hypothesis. But I mean, the devs are still ooobviously having to take weak consoles into account, right? They could make their games sooo much more demanding if they were PC-only, right? Right? No. Not even close. iRacing
CPU: Intel Core i3, i5, i7 or better or AMD Bulldozer or better
Memory: 8 GB RAM
GPU: NVidia GeForce 2xx series or better, 1GB+ dedicated video memory / AMD 5xxx series or better, 1GB+ dedicated video memory
These are PC-only games. That’s right, no consoles to hold them back; they don’t have to worry about whether an Xbox One could handle it. Yet they don’t require anything more than the multiplatform games. Subnautica
So what’s the deal? Theoretically, if developers don’t have to worry about console specs, then why aren’t they going all-out and making games that no console could even dream of supporting? Low-end PCs. What, did you think people only game on Steam if they spent at least $500 on gaming hardware? Not all PC gamers have gaming-PC specs, and if devs close their games off to players who don’t have the strongest of PCs, then they’d be losing out on a pretty sizable chunk of their potential buyers. Saying “devs having to deal with consoles is holding gaming back” is like saying “racing teams having to deal with Ford is holding GT racing back.” A: racing teams don’t have to deal with Ford if they don’t want to, which is probably why many of them don’t. B: even though Ford doesn’t make the fastest cars overall, they still manage to make cars that are awesome on their own; they don’t need to be compared to anything else for us to know that they make good cars. I want to go back to that previous point, though, developers having to deal with low-end PCs, because it’s integral to the next point:
“PCs are more powerful, gaming on PC provides a better experience.”
This one isn’t so much a misconception as it is… misleading. Did you know that according to the Steam Hardware & Software Survey (July 2017), the percentage of Steam gamers who use a GPU less powerful than a PS4 Slim’s is well over 50%? Things get dismal when compared to the PS4 Pro (or Xbox One X). On top of that, the percentage of PC gamers who own an Nvidia 10-series card is about 20% (about 15% for 1060, 1070 and 1080 owners). Now, to be fair, the large majority of gamers have CPUs with considerably high clock speeds, which is the main factor in CPU gaming performance. But the number of Steam gamers with as much RAM as a PS4 or Xbox One, or more, is less than 50%, which can really bottleneck what those CPUs can handle. These numbers are hardly better than they were in 2013, all things considered. Sure, a PS3/360 weeps in the face of even a $400 PC, but in this day and age, consoles have definitely caught up. Sure, we could mention the fact that even 1% of Steam accounts represents over 1 million accounts, but that doesn’t really matter compared to the tens of millions of 8th-gen consoles sold; looking at it that way, sure, the number of Nvidia 10-series owners is over 20 million, but that ignores the fact that over 5 times as many 8th-gen consoles have been sold. Basically, even though PCs run on a spectrum, saying they’re more powerful “on average” is actually wrong. Sure, they have the potential to be more powerful, but most of the time, people aren’t willing to pay the premium to reach those extra bits of performance. Now why is this important? What matters are the people who spent the premium cost for premium parts, right? 
Because of the previous point: PCs don’t have some universal quality advantage over the consoles. Developers will always have to keep low-end PCs in mind, because not even half of all PC players can afford the good stuff, and you have to look at the top quarter of Steam players before you get to PS4-Pro-level specs. If every Steam player were to get a PS4 Pro, it would be an upgrade for over 60% of them, and 70% of them would be getting an upgrade with the Xbox One X. Sure, you could still make the argument that when you pay more for PC parts, you get a better experience than you could with a console. We can argue all day about budget PCs, but a console can’t match up to a $1,000 PC build. It’s the same as paying more for car parts: in the end you get a better car. However, there is a certain problem with that…
“You pay a little more for a PC, you get much more quality.”
The idea here is that the more you pay for PC parts, the performance increases at a faster rate than the price does. Problem: that’s not how technology works. Paying twice as much doesn’t get you twice the quality the majority of the time. For example, let’s look at graphics cards, specifically the GeForce 10 series cards, starting with the GTX 1050.
1.35 GHz base clock
2 GB VRAM
This is our reference, our basis of comparison. Any percentages will be based on the 1050’s specs. Now let’s look at the GTX 1050 Ti, the 1050’s older brother.
1.29 GHz base clock
4 GB VRAM
This is pretty good. You only increase the price by about 27%, and you get an 11% increase in floating point speed and a 100% increase (double) in VRAM. Sure, you get a slightly lower base clock, but the rest definitely makes up for it. In fact, according to GPU Boss, the Ti managed 66 fps, a 22% increase in frame rate for Battlefield 4, and a 54% increase in MHash/second in bitcoin mining. The cost increase is worth it, for the most part. But let’s get to the real meat of it; what happens when we double our budget? Surely we should see a massive increase in performance; I bet some of you are willing to bet that twice the cost means more than twice the performance. The closest price comparison for double the cost is the GTX 1060 (3 GB), so let’s get a look at that.
1.5 GHz base clock
3 GB VRAM
Well… not substantial, I’d say. About a 50% increase in floating point speed, an 11% increase in base clock speed, and a 1 GB decrease in VRAM. For [almost] doubling the price, you don’t get much. Well, surely raw specs don’t tell the full story, right? Well, let’s look at some real-world comparisons. Once again, according to GPU Boss, there’s a 138% increase in hashes/second for bitcoin mining, and at 99 fps, an 83% frame-rate increase in Battlefield 4. Well, then, raw specs do not tell the whole story! Here’s another one, the 1060’s big brother… or, well, slightly-more-developed twin.
1.5 GHz base clock
6 GB VRAM
Seems reasonable, another $50 for a decent jump in power and double the memory! But, as we’ve learned, we shouldn’t look at just the specs for the full story. I did do a GPU Boss comparison, but for the BF4 frame rate, I had to look at Tom’s Hardware (sorry, miners, GPU Boss didn’t cover the MHash/sec spec either). What’s the verdict? Well, pretty good, I’d say. With 97 fps, a 79% increase over the 1050… wait. 97? That seems too low… I mean, the 3 GB version got 99. Well, let’s see what TechPowerUp has to say... 94.3 fps. A 74% increase. Huh. Alright, alright, maybe that was just a dud. We can gloss over that, I guess. OK, one more, but let’s go for the big fish: the GTX 1080.
1.6 GHz base clock
8 GB VRAM
That jump in floating point speed definitely has to be something, and 4 times the VRAM? Sure, it’s 5 times the price, but as we saw, raw power doesn’t always tell the full story. GPU Boss returns to give us the rundown; how do these cards compare in the real world? Well… a 222% (over three-fold) increase in MHash speed, and a 218% increase in fps for Battlefield 4. That’s right: for 5 times the cost, you get 3 times the performance. Truly, the raw specs don’t tell the full story. You increase the cost by 27%, you increase the frame rate in our example game by 22%. You increase the cost by 83%, you increase the frame rate by 83%. Sounds good, but increase the cost by 129% and you get a 79% increase in frame rate (a -50% cost/performance gap). Increase it by 358%, and you increase the frame rate by 218% (a -140% gap). That’s not paying “more for much more power,” that’s a steep drop-off after the third-cheapest option. In fact, did you know that you have to get to the 1060 (6 GB) before you can compare the GTX line to a PS4 Pro? Not to mention that at $250, the price of a 1060 (6 GB), you could get an entire PS4 Slim bundle, or that you have to get to the 1070 before you beat the Xbox One X. On another note, let’s look at a PS4 Slim…
800 MHz base clock
8 GB VRAM
…Versus a PS4 Pro.
911 MHz base clock
8 GB VRAM
A 128% increase in floating point speed and a 13% increase in clock speed, for a 25% difference in cost. Unfortunately there is no Battlefield 4 comparison to make, but in BF1, the frame rate is doubled (30 fps to 60) and the textures are taken to 11. For what that looks like, I’ll leave it up to this bloke. Not to mention that you can even get the texture buffs in 4K. Just as you get a decent increase in performance per dollar with the lower-cost GPUs, the same applies here. It’s even worse when you look at the CPU for a gaming PC. The more money you spend, again, the less of a benefit you get per dollar. Hardware Unboxed covers this in a video comparing different levels of Intel CPUs. One thing to note is that the highest i7 option (6700K) in this video was almost always within 10 fps (though for a few games, 15 fps) of a certain CPU in that list. …That CPU was the lowest i3 option (6100). The lowest i3 was $117 and the highest i7 was $339, a 189% price difference for what was, on average, a 30% or less difference in frame rate. Even the lowest Pentium option (G4400, $63) was often able to keep up with the i7. The CPU and GPU are usually the most expensive and power-hungry parts of a build, which is why I focused on them (other than the fact that they’re the two most important parts of a gaming PC, outside of RAM). With both, this “pay more to get much more performance” idea is pretty much the inverse of the truth.
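The diminishing-returns arithmetic above can be collected into one short sketch. The card and CPU percentages are the ones quoted in this post; the PS4 Slim's 1.84 TFLOPS figure is Sony's published spec, not something stated above:

```python
# Recomputing the price/performance figures quoted above (a sketch;
# inputs are the numbers given in the post, plus the PS4 Slim's
# published 1.84 TFLOPS, which the post itself does not state).

def pct_increase(new, old):
    """Percent increase from old to new, truncated to a whole percent."""
    return int((new - old) / old * 100)

# GTX lineup: (cost increase %, BF4 frame-rate increase %) vs. the 1050.
cards = {
    "1050 Ti":  (27, 22),
    "1060 3GB": (83, 83),
    "1060 6GB": (129, 79),
    "GTX 1080": (358, 218),
}
for name, (cost, fps) in cards.items():
    # Negative gap = frame rate lagging behind price, i.e. diminishing returns.
    print(f"{name}: {fps - cost:+d}% fps-vs-cost gap")

# PS4 Slim vs. Pro: 800 MHz -> 911 MHz base clock, 1.84 -> 4.2 TFLOPS.
print(pct_increase(911, 800))   # 13  -> the "13% increase in clock speed"
print(pct_increase(4.2, 1.84))  # 128 -> the "128% ... floating point" figure

# i3-6100 ($117) vs. i7-6700K ($339):
print(pct_increase(339, 117))   # 189 -> the "189% price difference"
```

The "gap" printed for the two dearest cards reproduces the post's -50% and -140% cost/power figures, which is the whole point: past the third-cheapest option, price grows faster than frame rate.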
“The console giants are bad for game developers; Steam doesn't treat developers as badly as Microsoft or especially Sony.”
Now, one thing you might’ve heard is that the PS3 was incredibly difficult for developers to make games for, which for some fueled the idea that console hardware is difficult to develop on compared to PC… but this ignores a very basic idea that we’ve already touched on: if the devs don’t want to make the game compatible with a system, they don’t have to. In fact, this is why Left 4 Dead and other Valve games aren’t on PS3: they didn’t want to work with its hardware, calling it “too complex.” This didn’t stop the game from selling well over 10 million units worldwide. If anything, this was a problem for the PS3, not the dev team. This also ignores that games like LittleBigPlanet, Grand Theft Auto IV, and Metal Gear Solid 4 all came out on PS3 in the same year as Left 4 Dead (2008). Apparently, plenty of other dev teams didn’t have much of a problem with the PS3’s hardware, or at the very least, they got used to it soon enough. On top of that, when developing the 8th-gen consoles, both Sony and Microsoft sought to use CPUs that were easier for developers to work with, which included making decisions that accounted for the consoles being used for more than gaming. Also, using their single-chip proprietary CPUs is cheaper and more energy-efficient than buying pre-made CPUs and boards, which is a far better reason for using them than some conspiracy about Sony and MS trying to make devs' lives harder. Now, console exclusives are apparently a point of contention: it’s often said that exclusivity can cause developers to go bankrupt. However, exclusivity doesn’t have to be a bad thing for the developer. For example, when Media Molecule had to pitch their game to a publisher (Sony, coincidentally), they didn’t end up being tied into something detrimental to them. Their initial funding lasted for 6 months. From then on, Sony offered additional funding in exchange for console exclusivity. 
This may sound concerning to some, but the game ended up going on to sell almost 6 million units worldwide and launched Media Molecule into the gaming limelight. Sony later bought the development studio, but 1: this was in 2010, two years after LittleBigPlanet’s release, and 2: Media Molecule seem pretty happy about it to this day. If anything, signing up with Sony was one of the best things they could’ve done, in their opinion. Does this sound like a company that has it out for developers? There are plenty of examples that people will use to put Valve in a good light, but even Sony is comparatively good to developers.
“There are more PC gamers.”
The total number of active PC gamers on Steam has surpassed 120 million, which is impressive, especially considering that this number is double 2013’s figure (65 million). But the number of monthly active users on Xbox Live and PSN? About 120 million (1, 2) total. EDIT: You could argue that this isn't an apples-to-apples comparison, sure, so if you want to, say, compare the monthly number of Steam users to console? Steam has about half of what the consoles do, at 67 million. Now, back to that 65 million total user figure for Steam in 2013: the best reference I could find for PlayStation's number was an article giving the number of registered PSN accounts in 2013, 150 million. In a similar 4-year period (2009 - 2013), the number of registered PSN accounts didn’t double, it sextupled, or increased six-fold. Considering how the PS4 is already at 2/3 of the number of sales the PS3 had, even though it’s currently 3 years younger than its predecessor was, I’m sure this trend is at least generally consistent. For example, let’s look at DOOM 2016, an awesome fast-paced shooting title with graphics galore… Of course, on a single platform, it sold best on PC/Steam: 2.36 million Steam sales, 2.05 million PS4 sales, 1.01 million Xbox One sales. But keep in mind… when you add the console sales together, you get over 3 million sales on the 8th-gen systems. Meaning: this game sold best on console. In fact, the Steam sales have only recently surpassed the PS4 sales. By the way, VGChartz only shows sales for physical copies of the games, so the number of PS4 and Xbox sales, once digital sales are included, is even higher than 3 million. This isn’t uncommon, by the way. Even with the games where the PC sales are higher than either of the consoles individually, there are generally more console sales in total. But, to be fair, this isn’t anything new. The number of PC gamers has never dominated the market; the percentages have always been about this much. 
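The DOOM comparison boils down to one addition (figures in millions of physical copies, per the VGChartz numbers quoted above):

```python
# DOOM (2016) physical sales in millions, per the VGChartz figures above.
steam, ps4, xbox_one = 2.36, 2.05, 1.01

console_total = round(ps4 + xbox_one, 2)
print(console_total)          # 3.06 -> "over 3 million" on 8th-gen consoles
print(console_total > steam)  # True: consoles combined out-sell the
                              # largest single platform (Steam)
```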
PC can end up being the largest single platform for games, but consoles usually sell more copies total. EDIT: There were other examples but... Reddit has a 40,000-character limit.
This isn’t to say that there’s anything wrong with PC gaming, and this isn’t to exalt consoles. I’m not here to be the hipster defending the little guy, nor to put down someone or something out of spite. This is about showing that PCs and consoles are overall pretty similar, because there isn’t much dividing them, and that there isn’t anything wrong with being a console gamer. There isn’t some chasm separating consoles and PCs; at the end of the day, they’re both computers that are (generally) designed for gaming. This is about unity as gamers, to try to show that there shouldn’t be a massive divide just because of the computer system you game on. I want gamers to be in an environment where specs don't separate us; whether you got a $250 PS4 Slim or just built a $2,500 gaming PC, we’re here to game and should be able to have healthy interactions regardless of platform. I’m well aware that this isn’t going to fix… much, but this needs to be said: there isn’t a huge divide between PCs and consoles; they’re far more similar than people think. There are upsides and downsides on each side that the other doesn’t have. There’s so much more I could touch on, like how you can use SSDs or 3.5-inch hard drives with both, or how, even though PC part prices go down over time, so do console prices, but I just wanted to touch on the main points people use to needlessly separate the two kinds of systems (looking at you, PCMR) and correct them, to get the point across. I thank anyone who takes the time to read all of this, and especially anyone who doesn’t take what I say out of context. I also want to note that, again, this isn’t “anti-PC gamer.” If it were up to me, everyone would be a hybrid gamer. Cheers.
[USA-CA] [H] Nearly-complete PC build (i5 CPU, 16GB RAM, MSI motherboard, Fractal Design case, Asus ROG Strix RX480 OC GPU) [W] Local cash, trade for Oculus Rift Touch
After some recent misadventures (I started out trying to upgrade this PC, but basically ended up building a new one by accident), I'd like to find a new home for my previous build. As of right now, this rig is missing only a power supply and an SSD or HDD (I migrated both into my new rig), but is otherwise complete. This machine was built and used over the past couple of years for light gaming and home-lab work; the GPU was purchased new by me from Amazon and was never used for bitcoin mining or anything like that. I've found that it handles most modern games quite well at 1080p. Here are the specs of what I'm including:
GPU: Asus ROG Strix Radeon RX 480 OC 8GB (I've read that this can effectively be 'flashed' into an RX 580, but I never tried)
CASE: Fractal Design Define Mini C with the see-through side panel (fantastic case! I jumped up to the full-size version of the same case for my new rig and have no regrets)
Also, here's an album of timestamps. I'm asking $350 (open to reasonable offers) local cash in the San Francisco Bay Area. I live in the East Bay and work in the South Bay, so I'm happy to meet up somewhere mutually convenient. I'm also open to trades for an Oculus Rift Touch setup, if anyone has one they're looking to trade. Let me know!
I've been holding off on upgrading my 670 due to the bitcoin-mining price gouging; however, I think it's about time to upgrade. I'm not interested in the 1080 Ti or 2080 Ti due to the crazy high prices. Even the 2080 seems a bit pricey at the moment. My question is: should I go with the 2070, or look to the 1070/1070 Ti series? I should note that I am also looking into purchasing Battlefield V, so those free downloads with the 2070 purchase are tempting. My current PC specs are below.
LOOKING TO BUY: 2070
APPROXIMATE PURCHASE DATE: ASAP
BUDGET RANGE: Prefer to stay under $550
USAGE FROM MOST TO LEAST IMPORTANT: Gaming; other tasks are minimal
CURRENT GPU AND POWER SUPPLY: Gigabyte GeForce GTX 670 2GB; PC Power & Cooling 750W ATX12V/EPS12V
CURRENT MONITOR: Acer Predator XB271HU 27" Monitor (1440p)
OTHER RELEVANT SYSTEM SPECS: Intel Core i5-3570K 3.4GHz Quad-Core Processor; Cooler Master Hyper 212 EVO 82.9 CFM Sleeve Bearing CPU Cooler; ASRock Z77 Extreme4 ATX LGA1155 Motherboard; G.SKILL TridentX Series 16GB (2 x 8GB) 240-Pin DDR3 SDRAM DDR3 1600 (PC3 12800); Seagate Barracuda 1TB 3.5" 7200RPM Internal Hard Drive; Cooler Master HAF 922 ATX Mid Tower Case
PREFERRED WEBSITE(S) FOR PARTS: Newegg, Amazon
COUNTRY OF ORIGIN: USA
PARTS PREFERENCES: Nvidia
OVERCLOCKING/SLI OR CROSSFIRE: No
MONITOR RESOLUTION: 1440p
ADDITIONAL COMMENTS: There's an added incentive for me that some of the card listings come with Battlefield V (Newegg).
Mega FAQ (Or: Please come here for your questions first)
Qbundle Guide (step-by-step setup & bootstrap): https://burstwiki.org/wiki/QBundle

1) I want to mine or activate my account. Where do I find the multiple coins?
You only need 1; an outgoing transaction or reward reassignment will set the public key. Get them from: https://www.reddit.com/burstcoinmining/comments/7q8zve/initial_burstcoin_requests/ or (faucet list) https://faucet.burstpay.net/ (if this is empty, come back later), http://faucet.burst-coin.es or https://forums.getburst.net/c/new-members-introductions/getting-started-initial-burstcoin-requests

2) I bought coins on Bittrex and want to move them to my new wallet, but can't. Why?
Bittrex will only send to accounts with a public key (not a Burst requirement), so see number 1 and either set the name on the account (IF you will not mine) or set the reward recipient to the pool. Either action will enable the account and allow for transfers from Bittrex.

3) I sent coins from Poloniex/anywhere to Bittrex and they don’t show up after a considerable time. Why?
You need to set an unencrypted message on the transaction informing Bittrex which account to send the funds to (this is in the directions on Bittrex). Did you do this? Contact Bittrex support with all the details and eventually you will get your funds.

4) How much can I make on Burst?
https://explore.burst.cryptoguru.org/tool/calculate gives you an average over time assuming a few things: average luck, 100% uptime, no overlapping plots, pool fees, and a good plot scan time (<20 seconds). If you do not have all of these, you may not see that number.

5) If I used SSDs, would I make more money?
No; it’s 95% capacity and 5% scan time that determine success. More plot area = better deadlines = better chance of forging a block, or better rates from a pool.

6) What are ‘solo’ and ‘pool’? (Wasn’t his name Chewbacca?)
Solo is where you attempt to ‘forge’ (mine) a block by yourself; you get 100% of the block reward and fees, but you only receive funds if you forge; no Burst for coming in second place. Pools allow a group of miners to ‘pool’ together their resources; when a miner wins, they give the pool the winnings (this is done by the reward assignment you completed earlier), it is then divided according to different percentages and methods, and Burst is sent out according to pool rules (minimum pay-out, time, etc.).

7) I have been mining for 2 days and my wallet doesn’t show any Burst. Why?
Mining solo: it is win-or-lose, nothing in between, and winning is luck and plot size. Pool mining: because it costs 1 Burst to send Burst, pools have either a time requirement (every X days) or a minimum amount (100+ Burst), so you need to research your pool. Some pools allow you to set the limit (CryptoGuru and similar) to be met before sending.

8) How do I see what I have pending?
On CryptoGuru-based pools, it’s the ‘Pending (Burst)’ column; on other pools, look for the numbers next to your Burst ID. One is paid and the other pending.

9) I’m part of a pool and I forged a block, but I didn’t receive the total value of the block. Why?
A pool has 2 basic numbers that denote the pay-out method, in the format ‘XX-XX’ (i.e. 50-50). The first number is the % paid to the block forger (miner) and the second is the retained value, which is paid to historic ‘shares’ (past blocks that the pool didn’t win, but where a miner was ‘close’ to winning with a good submitted deadline). Examples of pools: 0-100 (good for <40 TB), 20-80 (30 - 80 TB), 50-50 (60 - 200 TB), 80-20 (150 - 250 TB), 100-0 (solo mine, 150+ TB). Please note that there is an overlap, as this is personal preference and just guidance; a higher historical share value means a smoother pay-out regime, which some people prefer. If fees are not factored in, or are the same on different pools, the pay-out value will be the same over a long enough period.

10) Is XXX model of hard drive good? Which one do you recommend?
CHEAP is best. 
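"Cheap is best" just means minimising dollars per terabyte, since plot size matters most. A quick sketch of that comparison, together with the drives-per-thread rule of thumb from item 11 (the drive prices below are invented for illustration):

```python
# Cost-per-TB comparison ("cheap is best"); the prices here are made up.
def cost_per_tb(price_usd, capacity_tb):
    return price_usd / capacity_tb

print(cost_per_tb(150, 8))   # 18.75 $/TB
print(cost_per_tb(100, 4))   # 25.0 $/TB -> the 8 TB drive is the better buy

# Drives a miner can scan comfortably: ~2 per CPU thread
# (3 on a fast AVX2 CPU), per the rule of thumb in item 11.
def max_drives(threads, avx2=False):
    return threads * (3 if avx2 else 2)

print(max_drives(4))             # 8  -> the quad-core example
print(max_drives(8, avx2=True))  # 24 -> the 4-core/8-thread i7 example
```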
If you have 2 new hard drives, both covered by warranty, get the one with the lowest cost per TB (expressed as $/TB , calculated by dividing the cost by the number of terabytes) because plot size is KING, 11( How many drives can I have on my machine? For best performance, you can have up to 2 drives per thread (3 on a new fast AVX2 CPU). So that quad-core core-2-quad can have up to 8 drives, but a more modern i7 with 4 cores + hyper threading can squeeze 8 * 3 or 24 drives. (Performance while scanning will suffer) 12( Can I game while I mine? Some people have done so, but you cannot have the ‘maximum’ number of drives and play games generally. 13( Can I mine Burst and GPU mine other coins? Yes, if you CPU Mine Burst. 14( I’m GPU plotting Burst and GPU mining another coin, my plots are being corrupted, why? My advice is dedicating a GPU to either mining or plotting, don’t try to do both. 15( What is a ‘plot’? A plot is a file that contains Hashes, these hashes are used to mine burst. A plot is tied to an account, but they can be created (with the same account ID) on other machines and connected back to your miner(s). 16( Where can I trade/buy/sell Burst? A list of exchanges is maintained on https://www.reddit.com/burstcoin/ (on the right, ‘Exchanges’ tab) the biggest at the moment are Bittrex and Poloniex, some offer direct Fiat-to-Burst purchase (https://indacoin.com for example) 17( Do I have to store my Burst off the exchange? No, but it’s safer from hackers who target exchanges, if you cannot guarantee the safety or security of your home computer from Trojans etc, then it might be best to leave on an exchange (but enable 2FA security on your account PLEASE!) 18( What security measures can I take to keep my coin safe? 
When you create an account, sign out and back in to your wallet (to make sure you have copied the pass phrase correctly) and keep multiple copies of the key (at least one physically printed or written down and in a safe place, better in 2 places) do not disclose the passphrase to anyone. Finally use either a local wallet or a trusted web wallet (please research before using any web wallet) 19( How can I help Burst? Run a wallet, which will act as a node (or if you’re a programmer, contact the Dev team Bring attention to burst (without ‘shilling’ or trying to get people to buy) And help translate into your local language Be a productive member of the community and contribute experience and knowledge if you can, or help others get into Burst. 20( Will I get coins on the fork(s) and where will they be? There will be no new coin, and no new coins to be given/air dropped etc, the forks are upgrades to burst and there will not be a ‘classic’ or ‘new’ burst. 21( Will I need to move my Burst off of the exchange for the fork? No, your transactions are on the block chain, which will be used on the fork, they will be visible after the move; nothing will need to be done on your side. 22( Where can I read about the progress of Burst and news in general on the community? There is no finer place than https://www.burstcoin.ist/ 23( What are the communities for Burst and the central website? Main website: https://www.burst-coin.org/ Reddit: https://www.reddit.com/burstcoin and https://www.reddit.com/burstcoinmining/ Burstforum.net: https://www.burstforum.net/ Getburst forum: https://forums.getburst.net/ Official Facebook channel: https://m.facebook.com/groups/398967360565392 (these are the forums that are known to be supporting the current Dev Team) Other ways to talk to the community: Discord: https://discordapp.com/invite/RPhpjVv Telegram (General): https://t.me/burstcoin Telegram (Mining): https://t.me/BurstCoinMining 24( When will Burst partner up with a company? 
Burst is a currency, the USD does not ‘partner up’ with a company, the DEV team will not partner up and give over to special interests. 25( Why is the DEV team anonymous? They prefer anonymity, as it allows them to work without constant scrutiny and questions unless they wish to engage, plus the aim is for Burst to become a major contender, and this brings issues with security. They will work and produce results, they owe you nothing and if you cannot see the vision they provide then please do not ‘invest’ for short term gain. 26( When moon/Lambo/$100/make me rich? My crystal ball is still broken, come back to the FAQ later for answer (seriously, this is a coin to hold, if you want to day-trade, good luck to you) 27( How can I better educate myself and learn about Dymaxion? Read about the Dymaxion here: https://www.reddit.com/burstcoin/wiki/dymaxion 28( My reads are slow, why? There are many reasons for this, if your computer has a decent spec it’s likely due to USB3 hub issues, or plugging into a USB2 hub, but other reasons can be multiple plots in the same folder, but it’s best to visit the mining subreddit. They can help more than an simple FAQ https://www.reddit.com/burstcoinmining/ 29( I have a great idea for Burst (not proof of stake related)? Awesome! Please discuss with the DEV team on discord https://discordapp.com/invite/RPhpjVv (Please be aware that this is a public forum, you need to find who to ask/tell) 30( I have a great idea for Burst (Proof of stake related)? No. if you want a POS, find a POS coin. On the tangle which is being implemented a POS/POW/POC coin can be created, but BURST will always be POC mined. You are welcome to implement a proof of stake coin on this! 31( Will the Dev team burn any coins? Burst is not an ICO, so any coins will need to be bought to be burnt. You are welcome to donate, but the DEV team have no intention of burning any coins, or increasing the coin cap. 32( When will there be an IOS wallet? 
The iOS wallet is completed; we are waiting for it to go on the App Store. Apple is the delaying factor.

33) Why do overlapping plots matter?
Plots are like collections of lottery tickets (where only one ticket can win). Having two copies of the same ticket is not useful, and it means you have less coverage of 'all' the possible numbers. It's not good; avoid it.

34) My local wallet used to run, I synchronised it before, and now it says 'stopped'. When I start it, it stops after a few seconds. What should I do?
I suggest that you change the database type to portable MariaDB (in Qbundle, at the top, under 'Database' select 'Change database') and then re-import the database from scratch (see 35).

35) Synchronising the block chain is slow and I have the patience of a goldfish. What can I do?
In Qbundle, under 'Database' select 'Bootstrap chain' and make sure the CryptoGuru repository is selected, then 'Start import'. This will download and quickly populate the local database (I suggest portable MariaDB, see 34). (lol, loop)

36) What will the block reward be next month / will the block rewards run out in 6 months?
https://www.ecomine.earth/burstblockreward/ Rewards will carry on into 2026, but transaction fees will be a bigger percentage by then, and so profitable mining will continue.

37) How can I get started with Burst (wallet/mining/everything), and I need it in a video?
https://www.youtube.com/watch?v=LJLhw37Lh_8 Watch and be enlightened.

38) Can I mine on multiple machines with the same account?
Yes, if you want to pool mine this can be done (but be prepared for small issues like reported size being incorrect; just keep question 33 in mind).

39) Why do some of my drives take forever to plot?
Most likely they are SMR drives. It's best to plot onto an SSD and then move the finished plot (or part of a plot) across to the SMR drive, as this is much quicker. SMR drives are fine on the read; it's the random writes that are terrible.
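The 'lottery ticket' analogy in question 33 can be made concrete: a plot covers a contiguous range of nonces, and two plots waste disk wherever their ranges intersect. A minimal sketch, assuming the conventional accountID_startNonce_nonceCount plot naming (the helper names here are mine, not from any Burst tool):

```python
# Hedged sketch: detect overlapping Burst plots by their nonce ranges.
# A plot is assumed to cover [start_nonce, start_nonce + nonce_count).

def plots_overlap(a_start: int, a_count: int,
                  b_start: int, b_count: int) -> bool:
    """True if the two plots share any nonce (wasted disk space)."""
    return a_start < b_start + b_count and b_start < a_start + a_count

# A plot of 1,000,000 nonces starting at 0 overlaps one starting at
# 500,000; to avoid waste, the second should start at nonce 1,000,000.
print(plots_overlap(0, 1_000_000, 500_000, 1_000_000))    # True
print(plots_overlap(0, 1_000_000, 1_000_000, 1_000_000))  # False
```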
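On question 36: the schedule commonly described for Burst is a reward that started at 10,000 BURST and shrinks by roughly 5% each month. Treat the exact constants below as illustrative assumptions and use the linked ecomine.earth page for authoritative numbers; the point is that the reward decays geometrically rather than halving, which is why it tails off gradually instead of stopping:

```python
# Hedged sketch of the commonly described Burst reward decay:
# 10,000 BURST at launch, cut about 5% every month.
# Exact parameters are assumptions, not the authoritative table.

def block_reward(months_since_launch: int) -> int:
    """Approximate whole-BURST block reward after n monthly 5% cuts."""
    return int(10_000 * 0.95 ** months_since_launch)

for m in (0, 12, 48):
    print(m, block_reward(m))
```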
So to plot an SMR drive quickly, plot to a non-SMR drive (or better still an SSD) in as big a chunk as possible (fewer files is better) and then move the files across. A version of XPlotter, called SPlotter, can do this easily: https://github.com/NoParamedic/SPlotter

40) I have a great idea; why not get listed on more exchanges?!
Exchanges list coins for two reasons:
The coin pays (often A LOT, seriously we’ve been asked for 50 BTC)
I suggest you speak with your exchange and ask when they will offer Burst.

41) Do you have a roadmap?
https://www.burst-coin.org/roadmap

42) Why is the price of Burst going up/down/sideways/looping through time?
The price of Burst is still quite dependent upon Bitcoin: if Bitcoin gains, the value of Burst gains; if Bitcoin drops, Burst also drops. If there is news for Burst, then we see movement independent of Bitcoin. Variations can also come from people buying or selling in bulk. There are also 'pump and dump' schemes, which we detest, that can cause price spikes that have nothing to do with news or Bitcoin; just sad people taking advantage of others.

43) Where is the best place to go with my mining questions?
https://www.reddit.com/burstcoinmining/ or https://t.me/BurstCoinMining

44) What hardware do you advise me to buy; is this computer good?
See question 43 for specific questions on hardware; it depends on many variables. The 'best' in my opinion is a 36-bay Supermicro storage server; they usually have dual 6-core CPUs and space for 36 drives. No USB cables, a plotting and mining monster. Anything else, DYOR.

45) Where do you buy your hard drives?
I have bought most from eBay in job lots, plus some refurbished drives with short warranties. Everything else I have bought from Amazon.

46) Can I mine on my Google Drive / cloud-based storage?
In short: no. If you want to try, you will get to maybe 1 TB and then find that your local connection isn't fast enough, or that shortly afterwards your account is blocked for various reasons. Please be my guest.

47) Can I mine on my NAS?
With some you can mine on the NAS itself (if it can run the miner, it can scan locally), but generally they're not very fast; good for maybe 16 TB? Having a plot on a NAS and mining from another computer depends on the network speed between the NAS and the scanning computer.
I believe you can scan about 8 TB (maybe a bit more) and keep the scan times acceptable, but YMMV.

48) How can I set up a node?
No need to set up a node; just set up a wallet (version 2.0.4) or Qbundle (2.2) and it will do the rest.

49) Are the passphrases secure?
I'll leave it to the effort of a few people to show how secure a 12-word passphrase is: https://burstforum.net/topic/4766/the-canary-burst-early-warning-system Key point: brute-forcing it would take around 13,537,856,339,904,134,474,012,675,034 years.

50) I logged into my account (maybe with a different Burst ID) and see no balance!
I have dealt with this very issue multiple times, and there are only 3 options:
1. You have typed in the password incorrectly.
2. You have copy-pasted the password incorrectly.
3. You are trying to log into a 'local wallet' for which the block chain has not finished updating.
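The 8 TB figure in question 47 follows from how PoC mining reads plots: each block, the miner reads only one 'scoop', roughly 1/4096 of every plot (assuming the PoC2 layout), so the network link only has to carry a thin slice of the data per round. A back-of-the-envelope sketch, with throughput figures that are assumptions for illustration:

```python
# Hedged sketch: per-round read time for PoC mining over a network.
# Assumes the miner reads one scoop = 1/4096 of each plot per block.

SCOOPS = 4096  # scoops per nonce in the assumed PoC2 plot layout

def scan_seconds(plot_bytes: float, read_bytes_per_s: float) -> float:
    """Time to read one scoop column from a plot at a given throughput."""
    return (plot_bytes / SCOOPS) / read_bytes_per_s

# 8 TB of plots over gigabit LAN (~110 MB/s effective throughput):
print(scan_seconds(8 * 10**12, 110e6))  # roughly 17-18 seconds
```

Well under a typical block interval, which is why NAS mining at that scale stays workable; double the plots or halve the link speed and the margin shrinks accordingly.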
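The astronomical figure in question 49 is plain combinatorics: the search space is the wordlist size raised to the number of words. The wordlist size and attacker speed below are illustrative assumptions, not the wallet's actual parameters, so the result will differ from the quoted number; the point is the order of magnitude:

```python
# Hedged sketch: brute-force time for a 12-word passphrase.
# WORDLIST_SIZE and GUESSES_PER_SECOND are assumptions for
# illustration, not the actual Burst wallet parameters.

WORDLIST_SIZE = 1_626        # assumed dictionary size
WORDS = 12                   # words in the passphrase
GUESSES_PER_SECOND = 1e12    # a very generous attacker

search_space = WORDLIST_SIZE ** WORDS
seconds = search_space / GUESSES_PER_SECOND
years = seconds / (365.25 * 24 * 3600)

print(f"combinations: {search_space:.3e}")
print(f"years at 1e12 guesses/s: {years:.3e}")
```

Even with these conservative inputs the answer is on the order of 10^19 years, vastly longer than the age of the universe, which is the real takeaway of the linked thread.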