
Charts, Graphs, and Statistics of the Future

Posted: Tue May 18, 2021 10:49 am
by Yuli Ban
Relaunch of the old thread
Image

Look at that beautiful curve!

Re: Charts, Graphs, and Statistics of the Future

Posted: Tue May 18, 2021 8:53 pm
by Yuli Ban
Found the old collection of Gartner Hype Cycle graphs!

Re: Charts, Graphs, and Statistics of the Future

Posted: Wed May 19, 2021 12:14 am
by Yuli Ban
Image

Re: Charts, Graphs, and Statistics of the Future

Posted: Thu May 20, 2021 12:22 pm
by wjfox
Image

Re: Charts, Graphs, and Statistics of the Future

Posted: Sun May 23, 2021 8:50 pm
by wjfox
The resolution of microscopes.

We're currently hitting the realm of individual atoms.

However, reaching proton-scale imagery (five orders of magnitude smaller) will likely take another 150+ years.
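A quick back-of-the-envelope check on that gap, as a short Python sketch (the atom and proton sizes are rounded textbook values, and the ~30-years-per-order pace is just what the 150+ year estimate would imply, not a separate prediction):

```python
import math

atom_scale   = 1e-10   # metres, rough diameter of an atom
proton_scale = 1e-15   # metres, rough size of a proton (charge radius ~0.84 fm)

orders = math.log10(atom_scale / proton_scale)
print(f"gap between atomic and proton scales: {orders:.0f} orders of magnitude")  # 5

years_estimate = 150   # the figure quoted above
print(f"implied pace: ~{years_estimate / orders:.0f} years per order of magnitude")  # ~30
```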


Image

Re: Charts, Graphs, and Statistics of the Future

Posted: Tue Jun 01, 2021 5:21 pm
by Yuli Ban
Image

Re: Charts, Graphs, and Statistics of the Future

Posted: Tue Jun 15, 2021 11:32 pm
by Yuli Ban
Image

Re: Charts, Graphs, and Statistics of the Future

Posted: Tue Jun 15, 2021 11:35 pm
by Yuli Ban
Image

"ZeRO-Infinity? What?"

Ooh boy, here's the context!
ZeRO-Infinity and DeepSpeed: Unlocking unprecedented model scale for deep learning training
Since the DeepSpeed optimization library was introduced last year, it has rolled out numerous novel optimizations for training large AI models—improving scale, speed, cost, and usability. As large models have quickly evolved over the last year, so too has DeepSpeed. Whether enabling researchers to create the 17-billion-parameter Microsoft Turing Natural Language Generation (Turing-NLG) with state-of-the-art accuracy, achieving the fastest BERT training record, or supporting 10x larger model training using a single GPU, DeepSpeed continues to tackle challenges in AI at Scale with the latest advancements for large-scale model training. Now, the novel memory optimization technology ZeRO (Zero Redundancy Optimizer), included in DeepSpeed, is undergoing a further transformation of its own. The improved ZeRO-Infinity offers the system capability to go beyond the GPU memory wall and train models with tens of trillions of parameters, an order of magnitude bigger than state-of-the-art systems can support. It also offers a promising path toward training 100-trillion-parameter models.
ZeRO-Infinity at a glance: ZeRO-Infinity is a novel deep learning (DL) training technology for scaling model training, from a single GPU to massive supercomputers with thousands of GPUs. It powers unprecedented model sizes by leveraging the full memory capacity of a system, concurrently exploiting all heterogeneous memory (GPU, CPU, and Non-Volatile Memory express or NVMe for short). Learn more in our paper, “ZeRO-Infinity: Breaking the GPU Memory Wall for Extreme Scale Deep Learning.” The highlights of ZeRO-Infinity include: 
  • Offering the system capability to train a model with over 30 trillion parameters on 512 NVIDIA V100 Tensor Core GPUs, 50x larger than state of the art. 
  • Delivering excellent training efficiency and superlinear throughput scaling through novel data partitioning and mapping that can exploit the aggregate CPU/NVMe memory bandwidths and CPU compute, offering over 25 petaflops of sustained throughput on 512 NVIDIA V100 GPUs.
  • Furthering the mission of the DeepSpeed team to democratize large model training by allowing data scientists with a single GPU to fine-tune models larger than OpenAI GPT-3 (175 billion parameters).
  • Eliminating the barrier to entry for large model training by making it simpler and easier—ZeRO-Infinity scales beyond a trillion parameters without the complexity of combining several parallelism techniques and without requiring changes to user codes. To the best of our knowledge, it’s the only parallel technology to do this.
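For anyone wondering what this looks like in practice, here's a minimal sketch of a DeepSpeed setup with ZeRO stage 3 and NVMe offload, the heterogeneous-memory mechanism ZeRO-Infinity builds on. This is an illustration, not the configuration from the paper; the toy model, batch size, and nvme_path are placeholders.

```python
import torch
import deepspeed

model = torch.nn.Linear(1024, 1024)  # stand-in for a real transformer

ds_config = {
    "train_batch_size": 8,
    "fp16": {"enabled": True},
    "zero_optimization": {
        "stage": 3,  # partition parameters, gradients, and optimizer state
        "offload_param": {"device": "nvme", "nvme_path": "/local_nvme"},
        "offload_optimizer": {"device": "nvme", "nvme_path": "/local_nvme"},
    },
}

# DeepSpeed wraps the model; training then goes through engine.backward() / engine.step()
engine, optimizer, _, _ = deepspeed.initialize(
    model=model,
    model_parameters=model.parameters(),
    config=ds_config,
)
```

That last bullet is the key point: scaling past the GPU memory wall becomes a config change rather than a rewrite of the training code.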

Re: Charts, Graphs, and Statistics of the Future

Posted: Tue Jun 15, 2021 11:39 pm
by Yuli Ban
Image
Note that the cost per gigabyte of data storage provided by a USB flash drive has fallen from more than $8,000 when they were first introduced about ten years ago, to 94 cents today.
 
That’s a decline of 99.988% in ten years.
 
Just think about that for a minute.
A comment from 2013.
 
I don't know how much a flash gigabyte costs now, but I can only assume it's cheap if I can't even find a 1GB drive for sale, 4GB drives are cheaper than a bag of hard candy, and you can buy a 10-pack for the price of a hamburger. 40 gigabytes for a burger.
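The arithmetic behind that 99.988% figure, as a quick sketch using only the numbers quoted above (the per-year rate is just the implied compound average, not from the original source):

```python
# ~$8,000/GB at introduction vs. $0.94/GB roughly ten years later
start, end, years = 8000.0, 0.94, 10

total_decline = 1 - end / start               # fraction of the price that disappeared
annual_ratio = (end / start) ** (1 / years)   # implied compound yearly price ratio

print(f"total decline: {total_decline:.3%}")             # ~99.988%
print(f"average yearly drop: {1 - annual_ratio:.1%}")    # roughly 60% cheaper per year
```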

Re: Charts, Graphs, and Statistics of the Future

Posted: Tue Jun 15, 2021 11:41 pm
by Yuli Ban
Image