This post may contain paid links to my personal recommendations that help to support the site!

If you’re looking to do data science, you’d ideally want a powerful machine to crunch those numbers. Have you ever wondered how much RAM is actually needed for your data science work?

8 to 16 GB of Random Access Memory (RAM) is ideal for data science on a computer. Data science requires relatively good computing power. 8 GB is sufficient for most data analysis work, while 16 GB comfortably handles heavier use of machine learning models. When RAM is limited, cloud computing can take over the heavy lifting.

You must have noticed that I gave a range for the ideal RAM for data science and not any specific number. That’s because of the different use cases of data science. Let’s now explore more about RAM and when you should get a machine with 8 GB and when you should get one with 16 GB.

What is Random Access Memory (RAM)?

Photo by David Cassolato

Put simply, the RAM of a machine is the amount of short-term memory it has. Similar to us humans, computers operate using both short-term memory storage (RAM) and longer-term memory storage, which can be either a Hard Disk Drive (HDD) or Solid-state Drive (SSD).

A simple analogy can be drawn with paper in an office. Documents that you need often are kept on your desk, similar to the short-term RAM. Documents that you use less often but still keep around for reference are similar to long-term memory storage.

This short-term memory gives the computer quick access to whatever it is actively working on. In data science, most of your data has to sit in RAM while it is being processed.

If you’d like to know more about RAM, here’s a simple video I found that is great for beginners.

In What Cases Should You Have 8 GB of RAM?

Let’s run through some situations where a machine with only 8 GB is ideal. It may surprise you that, even with the smaller memory option, a large proportion of data science work can still be done. Here are some scenarios I could think of.

1. When Working With Small Datasets

The less data you have, the less computational effort your work needs. If you are dealing more with the prescriptive side of data science, you might only be working with small datasets. This means most of your data is stored in flat file formats like Excel sheets or Comma-Separated Values (CSV) files.

With data computations at such a small scale, most machines with 8 Gigabytes of RAM should be sufficient.
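As a rough back-of-the-envelope check (a sketch that assumes a purely numeric table stored as 64-bit floats; real datasets with text columns will differ), you can estimate how much RAM a dataset needs before loading it:

```python
# Rough estimate of the in-memory size of a numeric dataset:
# rows * columns * bytes per value (8 bytes for a 64-bit float).
def estimate_dataset_mb(rows, cols, bytes_per_value=8):
    return rows * cols * bytes_per_value / (1024 ** 2)

# A 1,000,000-row, 20-column numeric table:
size_mb = estimate_dataset_mb(1_000_000, 20)
print(f"{size_mb:.0f} MB")  # 153 MB, a small fraction of 8 GB
```

Even a million rows across twenty numeric columns only needs a few hundred megabytes, which is why 8 GB goes a long way for this kind of work.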

2. When Most of Your Computing is Handled by Cloud Computing

When working with large datasets for model training, the number crunching often does not happen on your local computer at all. These big data computations are sent to the cloud for processing.

In such scenarios, having more than 8 GB of RAM is unnecessary, since the RAM will only be used to run scripting tools or view reports in Python or R. The cloud does all the heavy lifting.

In fact, cloud computing and machine learning often go hand in hand. The compute you can rent in the cloud is many times greater than what a local machine can support, for example when running recommender systems built on fresh, live data.

Here’s a video introduction on cloud computing for data science if you are curious to know more.

3. When Resources Are Limited

Of course, if resources are limited in your data science project, then cloud computing is not likely to be your main source of computing power. This is most likely your scenario when you’re working on a personal side project without much funding to pay for a cloud computing service.

If you are on a tight budget, I would say that 8 GB should be good enough for most simple algorithms running locally.
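One way to stretch 8 GB further is to stream large files instead of loading them whole. Here is a minimal sketch using Python’s standard csv module (the `price` column name is just an illustration): memory use stays flat no matter how big the file is, because only one row is held at a time.

```python
import csv
import io

# Stream a CSV row by row so memory stays flat regardless of file size,
# which is handy on an 8 GB machine.
def column_mean(file_obj, column):
    reader = csv.DictReader(file_obj)
    total = count = 0
    for row in reader:
        total += float(row[column])
        count += 1
    return total / count

# Small in-memory sample standing in for a file on disk:
sample = io.StringIO("price\n10\n20\n30\n")
print(column_mean(sample, "price"))  # 20.0
```

The same streaming pattern works for sums, counts, and filtering; only operations that truly need the whole dataset at once (like sorting) force everything into memory.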

With that said, if you are still keen on doing an ambitious personal project on deep neural networks, then you might want to consider a machine with more juice, such as at least 16 GB of RAM. Let’s explore when you should consider 16 GB of RAM for your next data science project.

In What Cases Should You Have 16 GB of RAM?

1. When Processing Speed is Priority

I get it, sometimes waiting for your algorithm to finish its run can really be a drag. This is where a machine with 16 GB shines. With double the memory of an 8 GB machine, more of your data stays in RAM instead of swapping to disk, so each run finishes faster. You can finally get feedback on that new modification to your model sooner.

Machines with 16 GB really do pack a punch – they can handle most machine learning algorithms and heavy use of AI models.

2. When Cloud Computing is not an Option

Let’s say you handle mostly exploratory data analysis in your company and only need samples to understand the data better. For the same reason, your company has yet to implement cloud computing in its infrastructure. In that case, a machine with 16 GB of RAM is a feasible option to consider.

Where Can You Get Good RAM for Data Science?

You can get 2x 8 GB sticks to reach the full 16 GB just to be safe. Check out two really solid options: the Corsair Vengeance LPX and the Corsair Vengeance RGB PRO. These are safe picks if you’re looking for value-for-money RAM.

Related Questions

Is it Worth Having RAM above 16 GB?

According to some discussions on the Kaggle forums, most commenters agreed that 16 GB is more than sufficient for data analysis work. They recommended cloud systems as an alternative if the computing gets too heavy for a 16 GB machine.

Additionally, I personally do not believe that having more than 16 GB of RAM is worth it. The extra cost of a laptop with that much RAM is not worth the speed increase you will get. You would be much better off going with a cloud computing solution! I personally own a machine with 16 GB of RAM.

Can I Get a Machine Below 8 GB?

Getting a machine with less than 8 GB is not recommended, considering how computationally intensive data analysis can get. A Kaggle forum commenter added that the minimum RAM requirement for data science should be at least 8 GB.

What Other Parts Do I have to Consider in a Machine for Data Science?

1. More Cores

Having more cores on your machine can boost processing speeds, especially when you’re using threaded solvers in your data analysis. More cores let the machine run multi-threaded jobs in parallel, and combined with more RAM, your run times can be cut significantly.
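As a sketch of that fan-out pattern using only Python’s standard library (the `run_fold` function is a made-up stand-in for one independent model run): solvers backed by native code, such as NumPy/BLAS routines, release the GIL and can genuinely occupy several cores at once; for pure-Python CPU-bound work you would swap in `ProcessPoolExecutor` instead.

```python
import os
from concurrent.futures import ThreadPoolExecutor

def run_fold(fold_id):
    # Stand-in for one independent model run, e.g. one cross-validation fold.
    return fold_id + sum(i % 7 for i in range(1_000))

# Fan the independent runs out across the available logical cores.
with ThreadPoolExecutor(max_workers=os.cpu_count()) as pool:
    results = list(pool.map(run_fold, range(4)))

print(len(results))  # 4
```

`pool.map` preserves input order, so the results line up with the folds exactly as a serial loop would produce them.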

2. A Better GPU

The GPU, or Graphics Processing Unit, lets a computer run many operations in parallel. According to this GPU testing done on AI jobs, GPUs were found to be necessary for training and developing machine learning models.
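If you want to confirm your scripts can actually see a GPU, here is a quick check, assuming the PyTorch library (my choice for illustration; the testing above does not specify a framework):

```python
# Pick the GPU when PyTorch can see one, otherwise fall back to the CPU.
try:
    import torch
    device = "cuda" if torch.cuda.is_available() else "cpu"
except ImportError:
    # PyTorch is not installed; everything will run on the CPU.
    device = "cpu"

print(device)
```

Passing the resulting device name to your model and tensors is how PyTorch decides where the heavy computation runs.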

Are Desktop PCs required for Data Science or are Laptops sufficient?

Both are useful depending on the requirements. A desktop offers more customizable and cheaper options for RAM and GPUs, and is great for heavier AI jobs, but it requires some knowledge of PC building.

A laptop would typically come with smaller RAM sizes, starting from 4 GB. Although this can be a limiting factor when doing data science on your laptop, it can be complemented with cloud services for a simple and fast experience.

How Much RAM is Required for Machine Learning?

16 GB of RAM is required for machine learning for most uses. Machine learning typically runs in the cloud and does not demand much processing power from the local computer. However, smaller-scale machine learning run locally requires at least 16 GB to train models efficiently.

Final Thoughts

The RAM of your machine matters when doing data science. A range between 8 GB and 16 GB is the best fit for most data science projects. You’d just have to evaluate what’s more of a priority for your project and select accordingly. Choose your next machine wisely!

My Favorite Learning Resources:

Here are some of the learning resources I’ve personally found to be useful as a data analyst and I hope you find them useful too!

These may contain affiliate links and I earn a commission from them if you use them.

However, I’d honestly recommend them to my juniors, friends, or even my family!

My Recommended Learning Platforms!

Learning Platform | What’s Good About the Platform?
1. Coursera | Certificates are offered by popular learning institutes and companies like Google & IBM
2. DataCamp | Comes with an integrated coding platform, great for beginners!
3. Pluralsight | Strong focus on data skills, taught by industry experts
4. Stratascratch | Learn faster by doing real interview coding practices for data science
5. Udacity | High-quality, comprehensive courses

My Recommended Online Courses + Books!

Topic | Online Course | Book
1. Data Analytics | Google Data Analytics Professional Certificate |
2. Data Science | IBM Data Science Professional Certificate |
3. Excel | Excel Skills for Business Specialization |
4. Python | Python for Everybody Specialization | Python for Data Analysis
5. SQL | Introduction to SQL | SQL: The Ultimate Beginners Guide: Learn SQL Today
6. Tableau | Data Visualization with Tableau | Practical Tableau
7. Power BI | Getting Started with Power BI Desktop | Beginning Microsoft Power BI
8. R Programming | Data Science: Foundations using R Specialization | Learning R
9. Data Visualization | | Big Book of Dashboards

To see all of my most up-to-date recommendations, check out this resource I’ve put together for you here.
