
Avoiding the smoke - how to breathe clean air

If you’re in the western USA (like us) at the moment, you might be finding it hard to breathe. Breathing air that contains the fallout from fires can make you feel pretty awful, and it can be bad for long-term health as well. Wildfire smoke contains fine particulate matter, known as “PM2.5”, which can be inhaled deep into the lungs. The “2.5” here refers to the size of the particles — they are 2.5 microns or smaller. To see the air quality in your area, check out this AirNow map. Once it’s orange, you might find you start feeling the effects. If it’s red or purple, you almost certainly will. (Sometimes it can appear smoky outside, but the air quality can be OK, because the smoke might be higher in the atmosphere.)

The good news is that there’s a lot you can do to make the air you breathe a lot better. You might be wondering why a data scientist like me is commenting on air filtration… The reason is that I was a leader of the Masks4All movement, including writing the first and most comprehensive scientific paper on the topic, which meant I studied filtration very closely for months. In fact, the size of particles we want to block for wildfires is very similar to the size of particles we want to block for covid-19!


The three ways that you can breathe cleaner air are to use a mask, filter your home’s central air conditioner or heater, and use fans with filters. I’ll show you the details below. (There are quite a few links to places you can buy products in this post; I don’t get any commission from them, they’re just things that I’ve personally found helpful.)


Masks

Therefore, you won’t be surprised to learn that one of the most effective things you can do is to wear a mask. To block most PM2.5 particles you’ll want a mask that’s well-fitted and uses a good filter material. I’ve already prepared advice on that topic for COVID-19, and pretty much all of it is exactly the same for wildfire PM2.5, so go read this now. One bit that’s less of an issue is the “Sanitation” section — wildfire PM2.5 particles don’t carry disease, so you only have to worry about sanitation if your mask is actually getting dirty (or if you’ve been out in public with it on).

Personally, I like the O2 nano mask, or any well-fitted mask that you can insert a Filti filter into. Recent aerosol science tests show that a neck gaiter folded to create two layers works well too (but make sure you add a nose clip to remove gaps around your nose). Check out Etsy for lots of mask designs that include a filter pocket and nose clip.

Choose from thousands of mask designs with a filter pocket

Filtering your home air

To clean the air in your home, the basic idea is to have it continually pushed through a filter. A filter is simply a piece of material that air can get through, but PM2.5 particles can’t. No filter is perfect, but there are readily-available options which work very well. Filters have a MERV rating, which tells you how well they remove small particles. For wildfire smoke, you generally want MERV 13.

Don’t just buy the highest-rated filter you can find. Filters with higher ratings generally have smaller holes, which means they also don’t let air through as fast. Remember, we want your home’s air going through the filter quickly, to ensure all of it gets cleaned, so we don’t want the filter to restrict air-flow too much. I recommend Filtrete™ Healthy Living Air Filters, which have good air flow even at the MERV 13 spec.

Adding a filter to your central air

If you’ve got central heating or air conditioning, then you’re in luck. That will have strong fans, covering all of your rooms. The trick is to filter the air coming in to the system. Nearly all home systems simply pull their air in through a large vent inside your home. Some units have a filter slot in the unit itself, whereas for some the input vent is in a totally separate location in the house. Note that air conditioners blow hot air out of the house, but they don’t draw air in from outside (except, generally, for fancier commercial building HVAC systems).

Once you’ve found the inlet vent that your central air is pulling in from, add a filter to it. If there’s already one there, make sure it’s MERV 13 or 14. You should change it every 3 months or so (depending on the brand). A vent with a filter installed looks like this:

An inlet vent, showing filter underneath

NB: Most filters have an arrow on the side showing the direction of airflow. So make sure you put it the right way around! Also, make sure you buy the right size. Measure the size of your vent, and buy a filter that is at least big enough to cover the hole. If there are gaps, the air will go through them, instead of your filter!

If there’s not an obvious place to add a filter to your vent, you’ll need to get creative. It might not look pretty, but you could always just remove the vent cover and fasten the filter straight over the top, using tape, poster tack, etc.

Once you’ve got your filter in place, the most important thing is to set your system so that the fan runs all the time. Most systems have an “auto” setting, which only turns the fan on when heating or cooling. You don’t want that! Set the fan to “on”, not to “auto”. That way, you’re getting as much air through that filter as possible.

Adding filters to fans and portable A/C

I recommend having an air purifier in every room. Most air purifiers don’t really do that much, because they’re normally quiet and small (which means they don’t move much air). There are extra large purifiers for sale, but they’re very expensive, and often sold out at the moment.

But we can create our own air purifier that works as well as or better than the big expensive ones. An air purifier is simply a fan blowing air through a filter. So if we use a big fan and a good filter, then we have a good air purifier! The trick is to buy a 20 inch “box fan” (which is just a fan in a 20 inch square box), and stick a 20 inch filter in front of it. We pick 20 inches because that’s pretty big, and a bigger fan and bigger filter mean more filtration can happen in a given time.
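To get a feel for how much cleaning a setup like this can do, a back-of-the-envelope calculation helps. The numbers below are illustrative assumptions, not measurements (box fan airflow varies by model, and the filter's exact airflow penalty depends on the filter):

```python
# Rough air-changes-per-hour (ACH) estimate for a DIY fan + filter purifier.
# All inputs are assumptions for illustration only.
fan_cfm = 1000             # box fan on high, in cubic feet per minute (assumed)
filter_penalty = 0.5       # assume the filter roughly halves the airflow
room_volume = 12 * 12 * 8  # a 12x12 ft room with an 8 ft ceiling, in cubic feet

effective_cfm = fan_cfm * filter_penalty
ach = effective_cfm * 60 / room_volume  # air changes per hour
print(f"~{ach:.0f} air changes per hour")  # prints "~26 air changes per hour"
```

Even with those conservative assumptions, that is far more air turnover than a typical small consumer purifier provides for a room this size.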

I bought a few of these box fans: PELONIS 3-Speed Box Fan. I’m not saying this one is any better or worse than any other — just buy whatever you can get your hands on. You want one that has a high speed setting, to push lots of air through.

For filters, anything of the right size and MERV 13 or 14 spec should be fine. I bought this pack of six 20 inch Filtrete filters. Generally, higher quality filters will allow better air flow, and thicker filters can increase airflow too; e.g. instead of the 20x20x1 filters I got, you could try 20x20x4 (4 inch thick) filters.

The fans I bought have the on/off/speed switch on the front, so I first turned that to the maximum speed setting, since once I attached the filter I couldn’t access the switch any more. Then I stuck some of this adhesive foam all the way around the front face of the fan, trying to leave no gaps. The idea is that when I then stick the filter on top of this, there will be as few gaps as possible. It would probably work just as well to stick a long piece of poster tack all around the front face. Finally, I stuck the filter to the front of the fan using a generous quantity of high quality packing tape.

The completed DIY air purifier

These things are pretty noisy! But it’s a lot better than having a smoky house. They’re also pretty good for helping keep COVID-19 at bay, so if you have a shop or business, sprinkle a few of these around the place if you don’t have good filtered HVAC with a high change rate.

Another approach I’ve found useful is to buy a compact portable air conditioner. These come with a hose that blows hot air out through your window, and sucks air in through the front or back of the unit. You can stick a filter in front of where it sucks air in, using a similar approach to the fan discussed above.


Many thanks to Jim Rosenthal of Tex-Air Filters, and to Richard Corsi for the home-made air purifier idea. Jim has a fancier version for those with the budget. Thanks also to Jose-Luis Jimenez, Linsey Marr, Vladimir Zdimal, Adriaan Bax, and Kimberly Prather for many discussions that have helped me improve my (still limited!) understanding of aerosol science.

fast.ai releases new deep learning course, four libraries, and 600-page book

fast.ai is a self-funded research, software development, and teaching lab, focused on making deep learning more accessible. We make all of our software, research papers, and courses freely available with no ads. We pay all of our costs out of our own pockets, and take no grants or donations, so you can be sure we’re truly independent.

Today is fast.ai’s biggest day in our four year history. We are releasing:

  • fastai v2, a new version of our deep learning library
  • Practical Deep Learning for Coders, the course
  • Deep Learning for Coders with fastai and PyTorch, the book
  • fastcore, fastscript, and fastgpu, new Python libraries

Also, in case you missed it, earlier this week we released the Practical Data Ethics course, which focuses on topics that are both urgent and practical.


fastai v2

fastai is a deep learning library which provides practitioners with high-level components that can quickly and easily provide state-of-the-art results in standard deep learning domains, and provides researchers with low-level components that can be mixed and matched to build new approaches. It aims to do both things without substantial compromises in ease of use, flexibility, or performance. This is possible thanks to a carefully layered architecture, which expresses common underlying patterns of many deep learning and data processing techniques in terms of decoupled abstractions. These abstractions can be expressed concisely and clearly by leveraging the dynamism of the underlying Python language and the flexibility of the PyTorch library. fastai includes:

  • A new type dispatch system for Python along with a semantic type hierarchy for tensors
  • A GPU-optimized computer vision library which can be extended in pure Python
  • An optimizer which refactors out the common functionality of modern optimizers into two basic pieces, allowing optimization algorithms to be implemented in 45 lines of code
  • A novel 2-way callback system that can access any part of the data, model, or optimizer and change it at any point during training
  • A new data block API
  • And much more…
fastai's layered architecture
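To give a feel for what a 2-way callback system means, here is a toy sketch of the idea (my own illustration, deliberately simplified; fastai's real implementation is far richer): each callback holds a reference back to the trainer, so it can both read and mutate training state at any point.

```python
class Callback:
    # Each hook can read *and* mutate training state via `self.trainer`.
    def before_batch(self): pass
    def after_batch(self): pass

class Trainer:
    def __init__(self, callbacks):
        self.callbacks = callbacks
        for cb in callbacks: cb.trainer = self  # the "2-way" link
        self.loss = None

    def fit(self, batches):
        for self.batch in batches:
            for cb in self.callbacks: cb.before_batch()
            self.loss = sum(self.batch)  # stand-in for a real forward pass
            for cb in self.callbacks: cb.after_batch()

class ScaleBatch(Callback):
    # A callback that changes the data mid-training -
    # possible because it can write back into the trainer.
    def before_batch(self):
        self.trainer.batch = [x * 2 for x in self.trainer.batch]

t = Trainer([ScaleBatch()])
t.fit([[1, 2, 3]])
print(t.loss)  # prints 12: the batch was doubled before the "forward pass"
```

Techniques like mixup, gradient accumulation, and mixed precision can all be expressed this way, because a callback can intervene at any point in the loop rather than only observing it.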

fastai is organized around two main design goals: to be approachable and rapidly productive, while also being deeply hackable and configurable. It is built on top of a hierarchy of lower-level APIs which provide composable building blocks. This way, a user wanting to rewrite part of the high-level API or add particular behavior to suit their needs does not have to learn how to use the lowest level.

To see what’s possible with fastai, take a look at the Quick Start, which shows how to use around 5 lines of code to build an image classifier, an image segmentation model, a text sentiment model, a recommendation system, and a tabular model. For each of the applications, the code is much the same.

Example of using fastai for image segmentation

Read through the Tutorials to learn how to train your own models on your own datasets. Use the navigation sidebar to look through the fastai documentation. Every class, function, and method is documented here. To learn about the design and motivation of the library, read the peer reviewed paper, or watch this presentation summarizing some of the key design points.

All fast.ai projects, including fastai, are built with nbdev, which is a full literate programming environment built on Jupyter Notebooks. That means that every piece of documentation can be accessed as interactive Jupyter notebooks, and every documentation page includes a link to open it directly on Google Colab to allow for experimentation and customization.

It’s very easy to migrate from plain PyTorch, Ignite, or any other PyTorch-based library, or even to use fastai in conjunction with other libraries. Generally, you’ll be able to use all your existing data processing code, but will be able to reduce the amount of code you require for training, and more easily take advantage of modern best practices. Here are migration guides from some popular libraries to help you on your way: Plain PyTorch; Ignite; Lightning; Catalyst. And because it’s easy to combine any part of the fastai framework with your existing code and libraries, you can just pick the bits you want. For instance, you could use fastai’s GPU-accelerated computer vision library along with your own training loop.

fastai includes many modules that add functionality, generally through callbacks. Thanks to the flexible infrastructure, these all work together, so you can pick and choose what you need (and add your own), including: mixup and cutout augmentation, a uniquely flexible GAN training framework, a range of schedulers (many of which aren’t available in any other framework) including support for fine tuning following the approach described in ULMFiT, mixed precision, gradient accumulation, support for a range of logging frameworks like Tensorboard (with particularly strong support for Weights and Biases, as demonstrated here), medical imaging, and much more. Other functionality is added through the fastai ecosystem, such as support for HuggingFace Transformers (which can also be done manually, as shown in this tutorial), audio, accelerated inference, and so forth.

Medical imaging in fastai

There’s already some great learning material made available for fastai v2 by the community, such as the “Zero to Hero” series by Zach Mueller: part 1; part 2.

Practical Deep Learning for Coders, the course

Previous fast.ai courses have been studied by hundreds of thousands of students, from all walks of life, from all parts of the world. Many students have told us about how they’ve become multiple gold medal winners of international machine learning competitions, received offers from top companies, and had research papers published. For instance, Isaac Dimitrovsky told us that he had “been playing around with ML for a couple of years without really grokking it… [then] went through the fast.ai part 1 course late last year, and it clicked for me”. He went on to achieve first place in the prestigious international RA2-DREAM Challenge competition! He developed a multistage deep learning method for scoring radiographic hand and foot joint damage in rheumatoid arthritis, taking advantage of the fastai library.

This year’s course takes things even further. It incorporates both machine learning and deep learning in a single course, covering topics like random forests, gradient boosting, test and validation sets, and p values, which previously were in a separate machine learning course. In addition, production and deployment are also covered, including material on developing a web-based GUI for your own deep learning powered apps. The only prerequisites are high-school math and a year of coding experience (preferably in Python). The course was recorded live, in conjunction with the Data Institute at the University of San Francisco.

After finishing this course you will know:

  • How to train models that achieve state-of-the-art results in:
    • Computer vision, including image classification (e.g., classifying pet photos by breed), and image localization and detection (e.g., finding where the animals in an image are)
    • Natural language processing (NLP), including document classification (e.g., movie review sentiment analysis) and language modeling
    • Tabular data (e.g., sales prediction) with categorical data, continuous data, and mixed data, including time series
    • Collaborative filtering (e.g., movie recommendation)
  • How to turn your models into web applications, and deploy them
  • Why and how deep learning models work, and how to use that knowledge to improve the accuracy, speed, and reliability of your models
  • The latest deep learning techniques that really matter in practice
  • How to implement stochastic gradient descent and a complete training loop from scratch
  • How to think about the ethical implications of your work, to help ensure that you’re making the world a better place and that your work isn’t misused for harm

We care a lot about teaching, using a whole game approach. In this course, we start by showing how to use a complete, working, very usable, state-of-the-art deep learning network to solve real-world problems, using simple, expressive tools. And then we gradually dig deeper and deeper into understanding how those tools are made, and how the tools that make those tools are made, and so on. We always teach through examples. We ensure that there is a context and a purpose that you can understand intuitively, rather than starting with algebraic symbol manipulation. We also dive right into the details, showing you how to build all the components of a deep learning model from scratch, including discussing performance and optimization details.

The whole course can be completed for free without any installation, by taking advantage of the guides for the Colab and Gradient platforms, which provide free, GPU-powered Notebooks.

Deep Learning for Coders with fastai and PyTorch, the book

To understand what the new book is about, and who it’s for, let’s see what others have said about it… Soumith Chintala, the co-creator of PyTorch, said in the foreword to Deep Learning for Coders with fastai and PyTorch:

But unlike me, Jeremy and Sylvain selflessly put a huge amount of energy into making sure others don’t have to take the painful path that they took. They built a great course called fast.ai that makes cutting-edge deep learning techniques accessible to people who know basic programming. It has graduated hundreds of thousands of eager learners who have become great practitioners.

In this book, which is another tireless product, Jeremy and Sylvain have constructed a magical journey through deep learning. They use simple words and introduce every concept. They bring cutting-edge deep learning and state-of-the-art research to you, yet make it very accessible.

You are taken through the latest advances in computer vision, dive into natural language processing, and learn some foundational math in a 500-page delightful ride. And the ride doesn’t stop at fun, as they take you through shipping your ideas to production. You can treat the fast.ai community, thousands of practitioners online, as your extended family, where individuals like you are available to talk and ideate small and big solutions, whatever the problem may be.

Peter Norvig, Director of Research at Google (and author of the definitive text on AI) said:

“Deep Learning is for everyone” we see in Chapter 1, Section 1 of this book, and while other books may make similar claims, this book delivers on the claim. The authors have extensive knowledge of the field but are able to describe it in a way that is perfectly suited for a reader with experience in programming but not in machine learning. The book shows examples first, and only covers theory in the context of concrete examples. For most people, this is the best way to learn. The book does an impressive job of covering the key applications of deep learning in computer vision, natural language processing, and tabular data processing, but also covers key topics like data ethics that some other books miss. Altogether, this is one of the best sources for a programmer to become proficient in deep learning.

Curtis Langlotz, Director, Center for Artificial Intelligence in Medicine and Imaging at Stanford University said:

Gugger and Howard have created an ideal resource for anyone who has ever done even a little bit of coding. This book, and the fast.ai courses that go with it, simply and practically demystify deep learning using a hands on approach, with pre-written code that you can explore and re-use. No more slogging through theorems and proofs about abstract concepts. In Chapter 1 you will build your first deep learning model, and by the end of the book you will know how to read and understand the Methods section of any deep learning paper.

fastcore, fastscript, and fastgpu


Python is a powerful, dynamic language. Rather than bake everything into the language, it lets the programmer customize it to make it work for them. fastcore uses this flexibility to add to Python features inspired by other languages we’ve loved, like multiple dispatch from Julia, mixins from Ruby, and currying, binding, and more from Haskell. It also adds some “missing features” and cleans up some rough edges in the Python standard library, such as simplifying parallel processing, and bringing ideas from NumPy over to Python’s list type.

fastcore contains many features. See the docs for all the details, which cover the modules provided:

  • test: Simple testing functions
  • foundation: Mixins, delegation, composition, and more
  • utils: Utility functions to help with functional-style programming, parallel processing, and more
  • dispatch: Multiple dispatch methods
  • transform: Pipelines of composed partially reversible transformations
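To get a taste of the dispatch idea, note that Python's standard library already offers dispatch on the type of the first argument via functools.singledispatch; fastcore's type dispatch generalizes this to multiple arguments. A standard-library sketch of the basic pattern:

```python
from functools import singledispatch

# Dispatch on the type of the first argument: each registered
# implementation is chosen by the annotation on its parameter.
@singledispatch
def describe(x):
    return "something"

@describe.register
def _(x: int):
    return "an int"

@describe.register
def _(x: list):
    return "a list"

print(describe(3), describe([1]), describe("hi"))  # prints: an int a list something
```

fastcore takes this idea further, dispatching on the types of multiple arguments, which is what lets fastai pick the right behavior for, say, an (image, mask) pair versus an (image, label) pair.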


fastscript

Sometimes, you want to create a quick script, either for yourself, or for others. But in Python, that involves a whole lot of boilerplate and ceremony, especially if you want to support command line arguments, provide help, and other niceties. You can use argparse for this purpose, which comes with Python, but it’s complex and verbose. fastscript makes life easier. In fact, this is a complete, working command-line application (no need for any of the usual boilerplate Python requires, such as if __name__=='__main__'):

from fastscript import *
@call_parse
def main(msg:Param("The message", str),
         upper:Param("Convert to uppercase?", bool_arg)=False):
    print(msg.upper() if upper else msg)

When you run this script without any arguments, you’ll see:

$ python examples/test_fastscript.py
usage: test_fastscript.py [-h] [--upper UPPER] msg
test_fastscript.py: error: the following arguments are required: msg
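For comparison, here is roughly what the same tool looks like using only the standard library's argparse; this is the boilerplate fastscript is designed to remove (a sketch of equivalent behavior, not code from fastscript's docs):

```python
import argparse

def main(argv=None):
    # The same behavior, hand-written with the standard library:
    # build a parser, declare each argument, parse, then act.
    parser = argparse.ArgumentParser()
    parser.add_argument('msg', help="The message")
    parser.add_argument('--upper', action='store_true',
                        help="Convert to uppercase?")
    args = parser.parse_args(argv)
    print(args.msg.upper() if args.upper else args.msg)

main(['hello', '--upper'])  # prints "HELLO"; a real script would just call main()
```

With fastscript, the function signature itself is the argument declaration, so all of the parser setup above disappears.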


fastgpu

fastgpu provides a single command, fastgpu_poll, which polls a directory to check for scripts to run, and then runs them on the first available GPU. If no GPUs are available, it waits until one is. If more than one GPU is available, multiple scripts are run in parallel, one per GPU. It’s the easiest way we’ve found to run ablation studies that take advantage of all of your GPUs, with no parallel processing overhead and no manual intervention.
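The polling idea is simple enough to sketch in a few lines (a toy illustration of the concept, not fastgpu's actual code; the directory layout and `launch` hook here are my own invented stand-ins):

```python
import os
from pathlib import Path

def poll_once(script_dir, free_gpus, launch):
    """One polling pass: assign each queued script to a free GPU.

    `launch` is any callable taking (script_path, env); the env pins
    the job to its GPU via CUDA_VISIBLE_DEVICES. Returns the
    (gpu, script_name) pairs that were started this pass.
    """
    launched = []
    scripts = sorted(Path(script_dir).glob('*.sh'))  # queued jobs
    for gpu, script in zip(free_gpus, scripts):
        env = dict(os.environ, CUDA_VISIBLE_DEVICES=str(gpu))
        launch(script, env)
        launched.append((gpu, script.name))
    return launched
```

Because each job sees only its own GPU, the scripts need no parallelism logic of their own, which is what makes this pattern so convenient for ablation studies.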


Many thanks to everyone who helped bring these projects to fruition, most especially to Sylvain Gugger, who worked closely with me over the last two years at fast.ai. Thanks also to all the support from the Data Institute at the University of San Francisco, and to Rachel Thomas, co-founder of fast.ai, who (amongst other things) taught the data ethics lesson and developed much of the data ethics material in the book. Thank you to everyone from the fast.ai community for all your wonderful contributions.

Foreword from the 'Deep Learning for Coders' Book

To celebrate the release of fast.ai’s new course, book, and software libraries, we’re making available the foreword that Soumith Chintala (the co-creator of PyTorch) wrote for the book. To learn more, see the release announcement.

In a very short time, deep learning has become a widely useful technique, solving and automating problems in computer vision, robotics, healthcare, physics, biology, and beyond. One of the delightful things about deep learning is its relative simplicity. Powerful deep learning software has been built to make getting started fast and easy. In a few weeks, you can understand the basics and get comfortable with the techniques.

This opens up a world of creativity. You start applying it to problems that have data at hand, and you feel wonderful seeing a machine solving problems for you. However, you slowly feel yourself getting closer to a giant barrier. You built a deep learning model, but it doesn’t work as well as you had hoped. This is when you enter the next stage, finding and reading state-of-the-art research on deep learning.

However, there’s a voluminous body of knowledge on deep learning, with three decades of theory, techniques, and tooling behind it. As you read through some of this research, you realize that humans can explain simple things in really complicated ways. Scientists use words and mathematical notation in these papers that appear foreign, and no textbook or blog post seems to cover the necessary background that you need in accessible ways. Engineers and programmers assume you know how GPUs work and have knowledge about obscure tools.

This is when you wish you had a mentor or a friend that you could talk to. Someone who was in your shoes before, who knows the tooling and the math–someone who could guide you through the best research, state-of-the-art techniques, and advanced engineering, and make it comically simple. I was in your shoes a decade ago, when I was breaking into the field of machine learning. For years, I struggled to understand papers that had a little bit of math in them. I had good mentors around me, which helped me greatly, but it took me many years to get comfortable with machine learning and deep learning. That motivated me to coauthor PyTorch, a software framework to make deep learning accessible.

Jeremy Howard and Sylvain Gugger were also in your shoes. They wanted to learn and apply deep learning, without any previous formal training as ML scientists or engineers. Like me, Jeremy and Sylvain learned gradually over the years and eventually became experts and leaders. But unlike me, Jeremy and Sylvain selflessly put a huge amount of energy into making sure others don’t have to take the painful path that they took. They built a great course called fast.ai that makes cutting-edge deep learning techniques accessible to people who know basic programming. It has graduated hundreds of thousands of eager learners who have become great practitioners.

In this book, which is another tireless product, Jeremy and Sylvain have constructed a magical journey through deep learning. They use simple words and introduce every concept. They bring cutting-edge deep learning and state-of-the-art research to you, yet make it very accessible.

You are taken through the latest advances in computer vision, dive into natural language processing, and learn some foundational math in a 500-page delightful ride. And the ride doesn’t stop at fun, as they take you through shipping your ideas to production. You can treat the fast.ai community, thousands of practitioners online, as your extended family, where individuals like you are available to talk and ideate small and big solutions, whatever the problem may be.

I am very glad you’ve found this book, and I hope it inspires you to put deep learning to good use, regardless of the nature of the problem.

Soumith Chintala, co-creator of PyTorch

Applied Data Ethics, a new free course, is essential for all working in tech

Today we are releasing a free, online course on Applied Data Ethics, which contains essential knowledge for anyone working in data science or impacted by technology. The course focus is on topics that are both urgent and practical, causing real harm right now. In keeping with the fast.ai teaching philosophy, we will begin with two active, real-world areas (disinformation and bias) to provide context and motivation, before stepping back in Lesson 3 to dig into foundations of data ethics and practical tools. From there we will move on to additional subject areas: privacy & surveillance, the role of the Silicon Valley ecosystem (including metrics, venture growth, & hypergrowth), and algorithmic colonialism.

If you are ready to get started now, check out the syllabus and reading list or watch the videos here. Otherwise, read on for more details!

Issues related to data ethics make headlines daily, as real people are harmed by misuse

There are no prerequisites for the course. It is not intended to be exhaustive, but hopefully will provide useful context about how data misuse is impacting society, as well as practice in critical thinking skills and questions to ask. This class was originally taught in-person at the University of San Francisco Data Institute in January-February 2020, for a diverse mix of working professionals from a range of backgrounds (as an evening certificate course).

About Data Ethics Syllabi

Data ethics covers an incredibly broad range of topics, many of which are urgent, making headlines daily, and causing harm to real people right now. A meta-analysis of over 100 syllabi on tech ethics, titled “What do we teach when we teach tech ethics?”, found huge variation in which topics are covered across tech ethics courses (law & policy, privacy & surveillance, philosophy, justice & human rights, environmental impact, civic responsibility, robots, disinformation, work & labor, design, cybersecurity, research ethics, and more; far more than any one course could cover). These courses were taught by professors from a variety of fields. The area with the most unity was outcomes: the abilities to critique, spot issues, and make arguments were some of the most common desired outcomes for tech ethics courses.

There is a ton of great research and writing on the topics covered in the course, and it was very tough for me to cut the reading list down to a “reasonable” length. There are many more fantastic articles, papers, essays, and books on these topics that are not included here. Check out my syllabus and reading list here.

A note about the fastai video browser

There is an icon near the top left of the video browser that opens up a menu of all the lessons. An icon near the top right opens up the course notes and a transcript search feature.

Use the icons on the top left and right of the video browser to collapse/expand a menu and course notes/transcript search

Topics covered

Lesson 1: Disinformation

From deepfakes being used to harass women, widespread misinformation about coronavirus (labeled an “infodemic” by the WHO), fears about the role disinformation could play in the 2020 election, and news of extensive foreign influence operations, disinformation is in the news frequently and is an urgent issue. It is also indicative of the complexity and interdisciplinary nature of so many data ethics issues: disinformation involves tech design choices, bad actors, human psychology, misaligned financial incentives, and more.

Watch the Lesson 1 video here.

Lesson 2: Bias & Fairness

Unjust bias is an increasingly discussed issue in machine learning and has even spawned its own field as the primary focus of Fairness, Accountability, and Transparency (FAccT). We will go beyond a surface-level discussion and cover questions of how fairness is defined, different types of bias, steps towards mitigating it, and complicating factors.

Watch the Lesson 2 video here.

Lesson 3: Ethical Foundations & Practical Tools

Now that we’ve seen a number of concrete, real world examples of ethical issues that arise with data, we will step back and learn about some ethical philosophies and lenses to evaluate ethics through, as well as considering how ethical questions are chosen. We will also cover the Markkula Center’s Tech Ethics Toolkit, a set of concrete practices to be implemented in the workplace.

Watch the Lesson 3 video here.

Lesson 4: Privacy and surveillance

Huge amounts of data are being collected about us: apps on our phones track our location, dating sites sell intimate details, facial recognition in schools records students, and police use large, unregulated databases of faces. Here, we discuss real-world examples of how our data is collected, sold, and used. There are also concerning patterns of how surveillance is used to suppress dissent and to further harm those who are already marginalized.

Watch the Lesson 4 video here.

Lesson 5: How did we get here? Our Ecosystem

News stories understandably often focus on one instance of a particular ethics issue at a particular company. Here, I want us to step back and consider some of the broader trends and factors that have resulted in the types of issues we are seeing. These include our over-emphasis on metrics, the inherent design of many of the platforms, venture capital’s focus on hypergrowth, and more.

Watch the Lesson 5 video here.

Lesson 6: Algorithmic Colonialism, and Next Steps

When corporations from one country develop and deploy technology in many other countries, extracting data and profits, often with little awareness of local cultural issues, a number of ethical issues can arise. Here we will explore algorithmic colonialism. We will also consider next steps for how students can continue to engage around data ethics and take what they’ve learned back to their workplaces.

Watch the Lesson 6 video here.

For the applied data ethics course, you can find the homepage here, the syllabus and reading list here, and watch the videos here.

Essential Work-From-Home Advice: Cheap and Easy Ergonomic Setups

You weren’t expecting to spend 2020 working from home. You can’t afford a fancy standing desk. You don’t have a home office, or even much spare space, in your apartment. Your neck is getting a permanent crick from hunching over your laptop on the couch. While those of us who are able to work from home are privileged to have this option, we still don’t want to permanently damage our backs, necks, or arms from a bad ergonomic setup.

This is not a post for ergonomic aficionados (the setups I share could all be further optimized). This is a post for folks who don’t know where to get started, have a limited budget, and are willing to try simple, scrappy approaches. Key takeaway: for 34 dollars (21 for a good mouse, and 13 for a cheap keyboard), as well as some household items, you can create an ergonomic setup like the one below. I will show many other options throughout the post, for both sitting and standing, as well as approaches you can easily assemble/disassemble (if you are using the family dinner table and need to clear it off each evening).

While visiting family, I created an ergonomic setup on a counter

You can permanently damage your body with bad ergonomics

You can permanently damage your back, neck, and wrists from working without an ergonomic setup. Almost two decades ago, my partner Jeremy suffered a repetitive stress injury from working without an ergonomic setup. At the time, his arms were paralyzed and he had to take months off from work. Even now, after years of good ergonomics and yoga, this still impacts his life, severely limiting how much time he can spend in cars or on planes, and creating painful flare-ups. Please take this issue seriously.

Key advice: Have a separate keyboard and mouse

The most important thing to know is that you want your screen approximately at eye height, and your elbows at approximately right angles to your torso while you type and use the mouse. This is the case whether you are sitting or standing. If you are using a laptop, this will be impossible with the built-in keyboard and trackpad (no matter how nice they are). It is essential to have a separate keyboard and mouse. If you only do one thing to address ergonomics, obtain a separate keyboard and mouse.

If you can’t afford an external monitor, no worries, you can just elevate your laptop. Over the years, I have used cardboard boxes, drinking glasses, bottles of soda, board games, and stacks of books to elevate my laptop. I will recommend some keyboards and mice that I like below, but anything is better than using the ones built into your laptop (since that forces you to keep your screen at the wrong height). For example, the picture in the intro is of a set-up I created while visiting a family member’s apartment in 2014, using books and a cardboard box to elevate my keyboard, mouse, and laptop to the appropriate heights.

For the deep learning study group, I routinely used a brown cardboard box. Bonus: I could store everything in the box when we had to clear out of that room each night.

Above is a picture from the deep learning study group, which meets 5 days a week, for 7 weeks, every time we run the deep learning course. I use a brown cardboard box to elevate my keyboard. We have to clear out of that conference room each evening, and it is simple for me to put my items in the box. This sort of solution could work if you don’t have a dedicated office space in your home, and need to be able to set up/take down your workstation regularly.

I rarely worked in coffee shops pre-pandemic (and never do now), but when I had to I would still try to create an ergonomic setup (and go to a coffee shop where there was enough space!). Here, I’ve stacked my laptop on top of my rolled-up backpack. Ideally, my screen would be higher, but this is still better than having it at table level. Don’t let the perfect be the enemy of the good. Every step you take towards a more ergonomic setup is helpful.

When working at a coffee shop (pre-pandemic), I brought an external keyboard and mouse, and used my rolled-up backpack to raise the height of my laptop screen

About standing desks

If you have a regular desk (or even just a table) at home and want a standing desk, one option is to convert it using the $22 standing desk approach, which involves an Ikea side table and shelf. I had a previous job in which this was quite popular. Here is a photo of my work desk from that time.

In a previous job, many of us set up $22 standing desks using Ikea side tables

Standing on a hard floor can be difficult for your back. I have a GelPro mat, which I love. If you can’t afford a GelPro mat, standing on a folded-up yoga mat works great too.

Note that standing desks are not a cure-all. I’ve often seen people with expensive standing-desk converters (also known as desktop risers) who still have their monitor way too low. Even if you have an external monitor and desktop riser, make sure your monitor is at an appropriate height. It is likely you will still need to stack it on top of something. If you don’t like the aesthetics of using books or other household items, you can buy a monitor stand, such as this one.

Using a standing desk with poor posture is not very ergonomic, so be cognizant of when you start feeling fatigued. I prefer to switch between standing and sitting throughout the day, as my energy fluctuates.

Budget Recommendations

My “budget recommendation” would be to get an Anker vertical mouse for $21 and literally any keyboard. If you have to choose, I’ve found that having a good mouse is way more important than a good keyboard. It is important that you get some keyboard though, so that you can elevate your laptop screen. In the setup below, I’m using a lightweight travel keyboard that isn’t particularly ergonomic, but it works fine.

The barista at this coffee shop kindly let me use 2 plastic tubs to prop up my laptop.

I realize that at a time when many Americans do not have enough to eat, you may not have 34 dollars to spare (21 dollars for a mouse and 13 dollars for a cheap keyboard). However, if this is an option for you, it is well worth the cost. If you permanently damage your back, neck, or arms, no amount of money may be enough to heal them later.

Other products I like

My favorite mouse is the Logitech wireless trackball mouse. I have also used and liked the Anker vertical mouse. For keyboards, I like Goldtouch (I use an older version of this one) or the Microsoft Ergonomic Keyboard. And if you are looking for a compact, lightweight travel keyboard, I like the iClever foldup keyboard.

As mentioned above, GelPro mats are great if you are going to be standing, and a folded-up yoga mat is a cheaper alternative.

I have a Roost portable, lightweight laptop stand, which is great, although I can’t use it since I switched from a Macbook Air to a Microsoft Surface Pro. None of the links in this post are affiliate links; I’m just recommending what I’ve personally used and like.

For more about home office setups, Jeremy recently posted a Twitter thread about his preferred computer setup (which includes some pricier options). It’s also worth noting that his desk has a small footprint, and fits in the corner of our living room.