Quantifying Rafael Nadal’s Dominance with French Open Data

The French Open, also known as Roland-Garros, began on May 26th in Paris and culminates in championship matches held on June 8th and 9th. It is the second of four major tennis tournaments collectively known as the Grand Slam—the Australian Open, Wimbledon, and the U.S. Open being the other three. 

A big question heading into the tournament was whether tennis superstar Rafael Nadal would compete after being forced to miss last year’s tournament due to injury. Nadal has won the French Open 14 times in his career—the most of any individual player, male or female. Although he was defeated in the first round of the tournament this year by Alexander Zverev, Nadal’s career record is nevertheless impressive.

In this blog post we’ll explore his French Open data and try to identify the je ne sais quoi that led to his record-breaking success and earned him the nickname “The King of Clay.”

French Open Titles

In 19 career appearances, Rafael Nadal has won the French Open 14 times. The next closest male is Björn Borg with six titles. On the female side, Chris Evert holds the record with seven.

Taking our scope beyond just French Open data, no player, male or female, has 14 titles at any one of the Grand Slam tournaments. Aside from Nadal, the only active player in the table below is Novak Djokovic, who holds the closest active record with 10 Australian Open titles. Having already played that tournament 19 times, Djokovic would face a career-longevity challenge in unseating Nadal as the winningest player at a single major tournament.

Among these tennis greats, Nadal’s stretch of success at the French Open is truly eye-catching.

The table above comes from Flatiron’s analysis of individual titles at a single Grand Slam tournament in the Open Era (1968 to present). Court surfaces are in parentheses. 
Data source: Wikipedia

The Court at the French Open

A unique aspect of the French Open is its surface. Rather than the traditional blue or green hard court (typically concrete) that you’re likely to find at a nearby park or sports complex, the French Open features an orange-red surface made of densely packed clay. This surface results in a distinct gameplay that rewards defensive play and makes the ball behave differently off the bounce. Another challenge posed by clay is the reduced friction between the shoe and the surface, requiring players to slide into position to strike the ball as they move around the court.

Rafael Nadal on the clay court at the French Open.
Source: rolandgarros.com

Many experts attribute Nadal’s success at the French Open to his athleticism and emphasis on power and ball spin off the racket. These characteristics, accentuated by the clay court, allow him to hit returns other players cannot and to remain on the offensive even while his opponent is serving.

Comparing Match Stats with French Open Data

The Big Three—composed of Novak Djokovic, Roger Federer, and Rafael Nadal—is the nickname for the trio considered the greatest male tennis players of all time. Even among this group, Nadal’s return statistics stand out. In French Open matches he wins an average of 49% of return points, compared to Djokovic’s 44% and Federer’s 41%. Winning nearly 50% of return points is unheard of, especially when top players are expected to win 70% or more of their service points.

Additionally, Nadal’s median Ace Rate Against—a measure of how often a player is unable to touch their opponent’s serve—is just 2.7% at the French Open, the lowest among all three athletes at any Grand Slam. To take the opponent’s serving advantage away this dramatically is clear evidence of Nadal’s impressive upper hand on the clay court.

The figure below compares match statistics for Djokovic, Federer, and Nadal at Grand Slam tournaments across their respective careers.

The chart above comes from Flatiron’s analysis of career matches at Grand Slam tournaments for Rafael Nadal, Roger Federer, and Novak Djokovic.
Data source: Tennis Abstract

The Greatest Sports Records of All Time

The French Open data so far shows that no tennis player has dominated one of the majors the way Nadal has. But how does his record compare to non-tennis records? What methodology could we use to compare apples and oranges, or perhaps, tennis balls, basketballs, and hockey pucks?

There are a number of ways to approach any given problem in the field of data science. In fact, for a field known for its quantitative rigor, there are many aspects that allow for creativity. Designing data visualizations, weaving insights into a cohesive story, or, in our case, developing a methodology for comparing athletic achievements are all ways in which creative thinking is an asset.

For our problem, we could visualize the difference relative to the next best record-holder or compare the length of time previous records were held. These are interesting ideas, but really only compare two data points head-to-head. With sample size in mind, let’s try to contextualize how far out of the ordinary Nadal’s 14 championship wins are and do the same for a few other sports achievements.

The Z-Score

The number of wins at a tennis tournament is of a different magnitude than, say, the number of career points in basketball. To make things fair, we need to standardize. The “standard score,” sometimes called a “Z-score,” is a way for us to compare data measured on different scales. It can be calculated by taking each data point, x, subtracting the average, a, of data points from the same sample, and dividing by the standard deviation, S, a measure of how much variability there is in our data. Written in equation form, we have:

Z = (x − a) / S

As an example, on the top 50 list of most Grand Slam titles at a single tournament, the average is 2.2 with a standard deviation of 2.1. Therefore, the Z-score for Rafael Nadal’s French Open record is:

Z = (14 − 2.2) / 2.1 ≈ 5.6
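
We can check this arithmetic with a quick Python sketch (the values are taken from the text above):

x = 14   # Nadal's French Open titles
a = 2.2  # average among the top 50 single-tournament title counts
s = 2.1  # standard deviation of that sample

z = (x - a) / s
print(round(z, 1))  # 5.6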

Z-scores are unitless, meaning we can calculate and compare these for records from different categories, even different sports. It is important, however, to know that z-scores can be susceptible to skewed data. To confidently say if one is more extreme than another, we may need to account for distributional characteristics. For now, we can say with certainty that a positive z-score means a data point is atypically high relative to the population from which it comes. Conversely, a negative one means it is unusually low compared to its population. Simply put, the further a z-score is from zero, the more unusual, or further from average, it is.

Nadal vs. Gretzky, Clark, and Ledecky 

In the chart below, we compare Nadal’s record with three notable records:

  1. The National Hockey League career points record held by Wayne Gretzky since 1999.
  2. The recently-set women’s college basketball scoring record by Caitlin Clark.
  3. Katie Ledecky’s ever-growing count of gold medals for swimming at the Olympics and World Aquatics Championships.

The chart above comes from Flatiron’s analysis comparing Nadal’s 14 French Open titles to three other highly regarded athletic records.
Data sources: Ultimate Tennis Statistics, NHL, Sports Reference, Wikipedia

It’s clear that all four athletes’ achievements stand far beyond the competition. Nadal’s 14 titles put him farthest from average of these four records, but the right-skewed distributions make this an imperfect comparison. To definitively say whether his record is the greatest of all time, further analysis should include exploring measures that are more robust to skew and outliers. With Gretzky retired and Clark moving on to the WNBA, only Nadal and Ledecky can extend their records. Coincidentally, Ledecky also has an opportunity in Paris with the Olympics taking place this summer.

What’s Next?

Is Nadal’s record of 14 titles at the French Open the most impressive athletic feat of all time? One could certainly argue it is. Can it be broken? Only time will tell. As we’ve seen, even among tennis legends like Serena Williams, Roger Federer, and Novak Djokovic, Rafael Nadal stands alone more than 5.6 standard deviations above the average. Perhaps if a young up-and-coming player can perfect power, spin, and mobility on clay, they, too, could become clay court royalty. That is, if they can also remain at the top of their game over a multi-decade career the way Nadal has. As his fans say, “¡Vamos Rafa!”

Learn Data Science at Flatiron

Unlocking the power of data goes beyond basic visualizations. Our Data Science Bootcamp teaches data visualization techniques, alongside machine learning, data analysis, and much more. Equip yourself with the skills to transform data into insightful stories that drive results. Visit our website to learn more about our courses and how you can become a data scientist.

The Art of Data Exploration

Exploratory Data Analysis (EDA) is an essential initial stage in the workflow of data analysts and data scientists. It provides a comprehensive understanding of the dataset prior to delving into advanced analyses. Through data exploration, we summarize the main characteristics of a dataset. In particular, we reveal patterns, anomalies, and relationships among variables with the help of a variety of data exploration techniques and statistical tools. This process establishes a robust basis for further modeling and enables us to ask relevant research questions that would finally inform impactful business strategies.

Methods for Exploring Data

Data cleaning/preprocessing

Raw data is never perfect upon collection, and data cleaning/preprocessing involves transforming raw data into a clean and usable format. This process may include handling missing values, correcting inconsistencies, normalizing or scaling numerical features, and encoding categorical variables. This ensures the accuracy and reliability of the data for subsequent analysis and ultimately for informed decision making.

Descriptive statistics

Statistical analysis usually begins with descriptive analysis, also known as descriptive statistics. Descriptive analysis provides data analysts and data scientists with an understanding of distributions, central tendencies, and variability of the features. This lays the groundwork for future statistical inquiries. Many companies leverage the insights directly derived from descriptive statistics.

Basic visualization

Visualizations offer businesses a clear and concise way to understand their data. By representing data through graphs, charts, and plots, data analysts can quickly identify outliers, trends, relationships, and patterns within datasets. Visualizations facilitate the communication of insights to stakeholders and support hypothesis testing. They provide an easy-to-follow visual context for understanding complex datasets. In essence, visualizations and descriptive statistics go hand in hand—they often offer complementary perspectives that improve the understanding and interpretation of data for effective decision making.

Formulating Research Questions

Data exploration plays an important role in formulating insightful research questions. By employing descriptive analysis and data visualization, data analysts can identify patterns, trends, and anomalies within the dataset. Then, this deeper understanding of variables and their relationships serves as the foundation for crafting more robust and insightful research inquiries. 

Moreover, data exploration aids in evaluating how suitable the statistical techniques are for a specific dataset. Through detailed examination, analysts ensure that the chosen methodologies align with the dataset’s characteristics. Thus, data exploration not only informs the formulation of research questions but also validates the analytical approach, thereby enhancing the credibility and validity of subsequent analyses.

Flatiron Offers Access, Merit, and Women Take Tech Scholarships
Make your career change into data science a reality.
Learn More

Exploratory Data Analysis Tools

EDA relies on powerful tools commonly used in data science. These tools offer robust functionalities for data manipulation, visualization, and analysis, making them essential for effective data exploration. Below, let’s explore some of the most common tools and their capabilities.

Python

Python’s Pandas, NumPy, Seaborn, and Matplotlib libraries greatly facilitate the process of data loading, cleaning, visualization, and analysis. Moreover, their user-friendly design attracts users of all skill levels. Python also seamlessly integrates with statistical modeling frameworks such as statsmodels and machine learning frameworks such as Scikit-learn.

This integration enables smooth transitions from data exploration to model development and evaluation in data science workflows. The Python libraries below are commonly used within the data exploration landscape:

  • Pandas: Facilitates data manipulation and analysis through dataframe and series structures, effortlessly enabling tasks such as data cleaning, transformation, and aggregation.
  • NumPy: Supports scientific computing for working with multi-dimensional arrays, which is essential for numerical operations and data manipulation.
  • Matplotlib: Matplotlib is a versatile library for creating professional visualizations, providing fine-grained control over plotting details and styles. 
  • Seaborn: Seaborn builds on Matplotlib to offer a higher-level interface, specifically designed for statistical graphics, which simplifies the creation of complex plots.
  • Plotly: Specializes in generating interactive visualizations, supports various chart types, and offers features like hover effects and zooming capabilities. 

R

R is tailored for statistical computing and graphics, and features versatile packages for data manipulation and sophisticated visualization tasks. Its extensive statistical functions and interactive environment excel in data exploration and analysis. Some of R’s key packages used for data exploration and visualization are as follows:

  • dplyr: Facilitates efficient and intuitive data manipulation tasks such as filtering, summarizing, arranging, and mutating dataframes during data exploration.
  • tidyr: Serves as a companion package to dplyr, focusing on data tidying such as reshaping data, separating and combining columns/rows, and handling missing values.
  • ggplot2: Known as a popular plotting system for creating complex, layered visualizations based on the grammar of graphics.
  • plotly: Provides an interface for creating interactive visualizations and embedding them in web applications or dashboards.
  • ggvis: Offers an interactive plotting package built on ggplot2, and provides plots that respond to user input or changes to data.

Gain an Education in Data Exploration in a Matter of Months
See what Flatiron’s Data Science Bootcamp can do for you and your career.
Learn More

Step-by-step Data Analysis

Let’s start on our practical data exploration journey with a real-life dataset that can easily be found online.

For the demo in this post, we are going to perform EDA on the famous Titanic dataset. The dataset contains passenger information for the Titanic, such as the age, sex, passenger class, fare paid, and whether the passenger survived the sinking. We will be using the Python environment and will rely on the power of highly robust Python libraries that facilitate data manipulation, visualization, and analysis.

We will now proceed with a step-by-step approach of discovering the hidden depths of this dataset.

Step 1: Import libraries

Let’s start by importing the necessary libraries for data analysis and visualization. We’ll use Pandas for data manipulation, NumPy for numerical computing, and Seaborn and Matplotlib for visualization.
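
A minimal version of this step might look like the following (the aliases are the conventional ones, not required):

import pandas as pd
import numpy as np
import seaborn as sns
import matplotlib.pyplot as plt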

Step 2: Load the dataset

Let’s load the Titanic dataset, which is part of the Seaborn library’s built-in datasets. Next, let’s take a look at the first few rows of the dataset to understand its structure and contents. 
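
Using Seaborn’s built-in copy of the dataset, this step might look like:

# Load the dataset and inspect the first five rows
titanic = sns.load_dataset("titanic")
print(titanic.head())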

Step 3: Data exploration

Dataframe structure

Now let’s print a concise summary of the dataframe, displaying the total number of entries, feature names (columns), counts of non-null values, and data types assigned to each column.
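
In Pandas, this summary comes from the info() method:

titanic.info()  # entries, column names, non-null counts, and dtypes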

The Titanic dataset contains a total of 891 rows. This comprehensive summary is helpful for checking for null values and ensuring that the data types were assigned correctly—which is necessary for precise analysis.

During the data exploration process, it is very common practice to transform data types into a more usable format; however, for this dataset, we will keep them as is. We also see that some columns, like age, deck, and embark_town, have significant missing values. We will need to handle the null values for age later in our workflow, since we will be using this variable in our exploration.

Dataframe summary statistics

Let’s perform summary statistics on the numerical variables, which involves counts, means and standard deviations, quartiles, and minimums and maximums. This enables us to get an idea of the distribution or spread of numerical variables in a dataset, as well as pick out any possible outlying data points. For example, in the Titanic dataset, “fare” is a numerical variable referring to the money paid for a ticket. However, it covers a range of 0–512 with a median value of only 14.
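
In Pandas, the describe() method produces exactly this kind of table:

print(titanic.describe())  # count, mean, std, min, quartiles, max per numerical column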

In this case, it is reasonable to suspect that the presence of a value of “0” might indicate errors. Also, there appear to be significant outliers on the higher end of the distribution. It would be good practice to investigate further how the “fare” variable was coded. However, we will not make any modifications to it in this exploration, as we will not use it in our analysis.

Histogram insights

Now, let’s move on to generating histograms to portray the distributions of the numerical columns. Each vertical bar in a histogram represents an individual bin, and its height shows how many observations fall within that interval. Histograms significantly simplify the process of identifying patterns and trends present in our data, as well as highlighting any anomalies. These visualizations therefore assist us in making decisions about how to clean up our data before proceeding to modeling, enhancing the accuracy and reliability of our analyses.
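
One way to generate these histograms is through the Pandas plotting interface built on Matplotlib:

titanic.hist(figsize=(10, 8), bins=20)  # one histogram per numerical column
plt.tight_layout()
plt.show()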

Step 4: Data cleaning/preprocessing

Identify outliers

Outliers can significantly influence the results of data analysis, and boxplots provide a helpful visual for their identification. The outliers that are present in this plot of “age” in the Titanic dataset are those points outside of the boxplot whiskers, indicating some large individual values of age. For this data exploration, we will leave these points as is; however, it is possible that their removal is required for valid and reliable results based on context and the type of analysis performed.
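
A boxplot of “age” like the one described can be drawn with Seaborn:

sns.boxplot(x=titanic["age"])
plt.title("Distribution of passenger age")
plt.show()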

Handle missing values

Missing values are a common issue in datasets and can significantly impact the results of analyses by introducing bias or inaccuracies. They therefore need to be handled with care.

For example, for the “age” column in the Titanic dataset, we can apply several methods for the missing values. Simple approaches would involve removing rows with missing data or filling them with central tendency measures like mean, median, or mode.

Alternatively, more advanced methods such as predictive modeling can be used for imputation to fill in null values. These steps are vital for preserving data integrity and ensuring meaningful insights from subsequent analysis. In this scenario, we’ll just drop the rows with missing age values. 
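
Dropping the rows with missing age values is a one-liner in Pandas:

titanic = titanic.dropna(subset=["age"])  # keep only rows where age is known
print(titanic.shape)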

Feature engineering

Let’s also explore feature engineering to further enhance our analysis. Feature engineering involves creating new features or modifying existing ones so that we can gain deeper insights from the data.

For instance, we can engineer a new feature such as “age groups” based on the “age” variable. This new variable categorizes age into five groups: infant/toddler, child, teenager, adult, and senior. This will allow us to see survival rates across different age groups more clearly, and will provide deeper insight into the impact of age on survival.
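
A sketch of this feature using pd.cut follows; note that the exact cut points are our own assumption, as they aren’t specified above:

# Hypothetical boundaries: infant/toddler 0-4, child 5-12, teenager 13-19, adult 20-64, senior 65+
titanic["age_group"] = pd.cut(
    titanic["age"],
    bins=[0, 4, 12, 19, 64, 120],
    labels=["infant/toddler", "child", "teenager", "adult", "senior"],
)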

Step 5: Visualization

Now, let’s move on to visualizing relationships between some of the features within our dataset.

Survival rates by passenger class and sex: We’ll begin by creating a bar plot to examine survival rates based on passenger class and sex. Here, the x-axis represents the passenger class, while the y-axis illustrates the survival rate. We’ll employ the hue parameter to distinguish between male and female passengers.
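
With Seaborn, this plot might be produced as follows (barplot averages the 0/1 survived column, which yields a survival rate):

sns.barplot(data=titanic, x="class", y="survived", hue="sex")
plt.ylabel("Survival rate")
plt.show()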

This bar plot reveals compelling insights into the survival patterns as a function of passenger class and sex. Overall, first-class passengers survived at much higher rates compared to those in second and third class. Meanwhile, female passengers consistently had higher survival rates than males across all passenger classes. The most notable difference between male and female survival rates was among second-class passengers.

Remarkably, almost all first-class females survived with a rate of 96%, reminiscent of characters like Rose DeWitt Bukater from the movie Titanic. In contrast, the survival rate for third-class males, like the character Jack Dawson from the same movie, was sadly only at 15%.

In addition to the visual representation above, let’s create a table to display the mean survival rates for each sex and passenger class. This table could be further improved by including additional statistical features such as counts, ranges, and variability.
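
A pivot table of mean survival rates is one way to build such a table:

print(titanic.pivot_table(values="survived", index="class", columns="sex", aggfunc="mean"))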

Survival rates by age groups and sex: Next, let’s create a bar plot to visualize survival rates by age group and sex, this time utilizing the newly engineered “age group” variable. This visualization will shed light on how survival rates differ among various age groups and between males and females.
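
Reusing the engineered column, the plot can be created the same way as before:

sns.barplot(data=titanic, x="age_group", y="survived", hue="sex")
plt.ylabel("Survival rate")
plt.show()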

The plot above reveals survival patterns based on age and sex. As with the previous plot, overall, female passengers survived at much higher rates than male passengers. However, the data also suggests that similar survival rates were observed for male and female passengers who were infants and toddlers. Nevertheless, as age increased, males tended to survive at increasingly lower rates while females tended to survive at increasingly higher rates.

This observation reflects the implementation of the “women and children first” principle. Notably, male children received a comparable level of priority to female children. However, despite this prioritization, males consistently faced lower survival rates compared to females, particularly more so as age advanced.

Step 6: Summary 

Through the systematic use of data exploration, descriptive statistics, and basic visualization, we’ve revealed valuable insights into the survival dynamics of the Titanic’s tragic voyage.

Based on our analysis, we uncovered that passengers were more likely to survive if:

  • They held a higher class ticket.
  • They were female.
  • They were infants/toddlers regardless of sex.

Step 7: Next steps

There are many more analyses and insights to uncover in the Titanic dataset depending on your specific questions and interests. After completing data exploration, the next steps could involve hypothesis testing and advanced modeling. For instance, we might test hypotheses regarding the relative impact of other kinds of passenger characteristics on survival rates. 

Additionally, statistical or machine learning models can provide deeper insights into the most significant determinants of survival rates. For example, we could use logistic regression to predict survival based on features such as passenger age, sex, and class, and any possible interaction between them. Or we could apply a machine learning approach such as a random forest model to predict survival based on all available passenger characteristics.

Data Exploration: Conclusion

In summary, data exploration is an essential initial stage in any data analysis workflow. By systematically examining data using mathematical computations, statistical approaches, and visualizations, EDA reveals patterns, relationships, and insights in an iterative and interactive manner. It plays an important role in understanding and interpreting data, and shapes the trajectory of further analysis, ultimately leading to reliable data-driven insights.

Flatiron School’s Data Science Bootcamp offers a fast path to an education in data exploration and exploratory data analysis. Book a call with our Admissions team today to learn more about our program and what it can do for your career.

Unlocking Enhanced Security: The Zero Trust Approach

Zero Trust challenges the longstanding assumption of implicit trust within network environments. Unlike conventional models that rely on perimeter defenses, it operates on the principle of continuous verification, demanding strict identity authentication for every user and device seeking access to network resources, regardless of their location.

Imagine a castle with no safe haven; that’s the ethos of Zero Trust. In this dynamic landscape, potential threats exist both outside and within the network, necessitating constant vigilance. Every user’s identity and every device’s integrity undergo regular scrutiny, with connections timing out periodically, compelling re-verification. This stance ensures that security remains a proactive endeavor rather than a reactive afterthought.

Zero Trust Principles

At the heart of Zero Trust lies the principle of least privilege—access is not a right but a privilege, granted only to those who truly need it. This approach mirrors military operations, where information is disseminated strictly on a need-to-know basis. By minimizing user exposure to critical network segments, organizations can significantly reduce the risk of breaches and data leaks.

But Zero Trust doesn’t stop at user-centric controls; it extends its reach to device management as well. Rigorous protocols govern device authorization, monitoring the influx of new devices while assessing their security standing. By maintaining strict control over device access, organizations can shrink their attack surface and bolster network resilience.

Microsegmentation is another cornerstone. By dividing network perimeters into discrete zones, each operating autonomously, Zero Trust ensures that a breach in one segment doesn’t compromise the integrity of the entire network. This granular approach adds an extra layer of defense, confounding would-be intruders and limiting the scope of potential damage.

Flatiron Has Awarded Over $8.6 Million in Scholarships to Support Students in Their Education Pursuits

Learn More

Moreover, Zero Trust is designed to thwart lateral movement—the bane of conventional security models. By erecting virtual barriers between network segments, it impedes the progress of intruders, confining their activities to isolated pockets. In the event of a breach, swift containment measures isolate compromised elements, curtailing the spread of malicious activity.

Multi-Factor Authentication (MFA) is the final piece of the Zero Trust puzzle. By requiring multiple forms of identification, such as passwords and verification codes, MFA fortifies authentication processes, rendering them resilient to brute-force attacks.

Navigating the Zero Trust Architecture: A Roadmap to Security Implementation

Zero Trust implementation is not without hurdles. The complexity of technology stacks, organizational silos, and budgetary constraints pose formidable challenges. Yet, amidst these obstacles, there is a roadmap to guide organizations along the Zero Trust journey, tailored to their unique needs and priorities.

Stage 1: Visualize

The initial step in the journey is to visualize the entirety of the digital landscape. This entails comprehending all resources, their access points, and associated risks. Key objectives include clarifying every entity—from identities and endpoints to workloads—and discerning vulnerabilities or risks inherent within them. Achieving this level of insight demands vigilance, especially when navigating diverse business units, merger and acquisition scenarios, or multi-cloud environments.

Best practices in this phase involve bridging visibility gaps and scrutinizing all entities, regardless of their location or ownership. Threats against the entire identity store, encompassing protocols like Windows New Technology LAN Manager (NTLM) and relay attacks, must be vigilantly monitored to safeguard the entire ecosystem from compromise.

Stage 2: Mitigate

Having visualized the landscape, the next stage focuses on proactive threat mitigation and response. Real-time detection mechanisms, coupled with behavioral analytics, are deployed to thwart insider threats and credential takeovers. Segmentation and least privilege principles serve as bulwarks against breaches, curbing lateral movement within the environment.

Optimizing this phase entails minimizing operational overhead through identity-based segmentation and adopting a policy model that aligns with security and compliance needs seamlessly. Automation plays a pivotal role in expediting response times and bolstering cost efficiency, with policy actions seamlessly integrated into existing Security Orchestration, Automation, and Response (SOAR) workflows.

Stage 3: Optimize

The final leg of the journey revolves around extending protection across all facets of the IT infrastructure, without compromising user experience. Conditional access mechanisms, driven by risk-based factors, ensure continuous verification while mitigating MFA fatigue. Legacy systems are not exempt from scrutiny, as MFA protection is extended to safeguard the most vulnerable assets.

Key best practices at this juncture involve embracing risk-based, conditional access to mitigate user friction, particularly among privileged users. Detecting and responding to threats in public clouds and Single Sign On (SSO) credentials, even in the absence of deployed sensors or agents, underscores the importance of a comprehensive security posture.

The Zero Trust journey is a multi-faceted endeavor, demanding a strategic blend of technology, collaboration, and foresight. By heeding Flatiron’s recommendations and charting a course aligned with organizational imperatives, enterprises can traverse the Zero Trust landscape with confidence, fortified against the ever-present specter of cyber threats.

Navigating the Benefits and Drawbacks

Once a niche concept in cybersecurity, Zero Trust has now emerged as a standard in fortifying organizational defenses against an evolving threat landscape. While the benefits are compelling, the journey is not without its challenges. Let’s delve into both aspects to understand the intricacies of this transformative security archetype.

Zero Trust Benefits

Enhanced security

The foremost advantage of Zero Trust lies in its ability to elevate security postures to unprecedented levels. By embracing advanced security tools such as Identity and Access Management (IAM), MFA, and Extended Detection and Response (XDR), organizations fortify their defenses against a myriad of threats. Notably, the adoption of Zero Trust has led to notable improvements in Security Operations Center (SOC) efficiency, with automation capabilities streamlining detection and response mechanisms.

Simplifying security architecture and enhancing user experience

Zero Trust not only bolsters security but also streamlines organizational architectures, empowering security teams to respond to security events with agility. Through the consolidation of access for end-users, organizations can achieve a Zero Trust posture while ensuring a seamless user experience. The implementation of Secure Access Service Edge (SASE) or Zero Trust Network Access (ZTNA) architectures further augments user enablement, irrespective of device or location.

Adapting to remote work and cloud adoption

Amidst the paradigm shift towards remote work and cloud adoption, Zero Trust emerges as a beacon of resilience. By meticulously authenticating and continually verifying user identities, it mitigates the risks associated with decentralized environments. Also, its scalability ensures seamless connectivity for remote employees while safeguarding cloud infrastructures from unauthorized access.

Zero Trust Drawbacks

Scope of the initiative

One of the primary challenges in implementing Zero Trust lies in the vast scope of the initiative. Organizations must navigate a multitude of systems, applications, and data repositories, a task that often spans years. Prioritization becomes paramount to prevent overwhelm, with many organizations opting to start with specific use cases before gradually expanding their Zero Trust footprint.

Need for a strong identity system

Central to the efficacy of Zero Trust is a robust identity system, a cornerstone of authentication and authorization processes. Identity and Access Management (IAM) tools play a pivotal role in verifying user and device identities, yet they remain prime targets for malicious actors. The resilience of an organization’s identity system directly influences the success of its Zero Trust implementation, underscoring the need for stringent security measures.

Remaining security risks

Despite its name, Zero Trust does not eliminate all security risks. Entities granted a degree of trust are susceptible to compromise, necessitating continuous vigilance. While it significantly enhances security postures, organizations must remain vigilant against emerging threats, ensuring the integrity of their security frameworks.

Zero Trust offers a compelling promise of enhanced security and resilience in an era defined by digital disruption. While challenges abound, organizations equipped with a strategic approach can navigate the Zero Trust landscape with confidence, forging a path towards a future fortified against cyber threats.

The Demand for Cybersecurity Professionals is Growing Rapidly

As of May 2024, the U.S. Bureau of Labor Statistics projects 32% national growth for cybersecurity analysts through the year 2032. Flatiron School’s Cybersecurity Bootcamp can help you fast-track a career in cybersecurity in a matter of months. Begin the application process today or book a 10-minute call with our Admissions team to learn more.

Software Engineering in the Age of AI

The landscape is shifting. The reality is that artificial intelligence (AI) is fundamentally altering everything—upending industries, redefining roles, and transforming how we approach everyday tasks like writing emails and touching up selfies. In the last three years, Generative AI models have advanced significantly, making tools like OpenAI’s ChatGPT accessible to just about everyone for optimizing workflows and enhancing productivity. This integration of AI across such a vast array of platforms signifies a new baseline for business operations and innovation. 

It’s hard to miss—almost every headline about tech concerns AI’s potential impact on the future. However, no one has a crystal ball to predict what the new normal will be. Many executives don’t yet understand AI or their teams’ proficiency with AI tools, so they are uncertain about how to implement AI in their organizations. Analysts and futurists are making educated guesses about the effects of AI. Some predict the automation of everything; some predict a new era of human flourishing. It’s confusing, leaving us with significant uncertainty about the potential and limitations of AI technologies and the ways specific industries and jobs may change.

This article discusses the continued importance of software engineering in the AI era and how AI can complement and expand these skills in the coming years.

Is Software Engineering Still a Viable Career Path?

In short, yes. The tech industry is constantly changing and adapting. The creation of personal computers was a massive technological shift that was met with trepidation and concern and resulted in an enormous explosion in products and jobs. Frameworks, testing, and automation techniques have evolved for decades, creating significant productivity gains. The truth is that AI-assisted coding has been available to developers for years, and most of the potential gains of emerging technologies aren’t far out of line with the work that has happened in the past. 

Despite all of this, software engineering skills remain essential. The demand for skilled engineers is expected to grow by 25% in the next 5-6 years. That growth is driven by digital transformation and AI integration across all sectors. Software engineering is evolving to accommodate AI, necessitating a shift in skills while remaining foundational to the development of digital products. Its foundational pillars—programming, problem-solving, creativity, and complex system design—are as relevant as ever.

Programming Proficiency & Application Development

The fundamental role of coding in software engineering isn’t likely to change any time soon. Python and JavaScript are pivotal languages that every programmer will need to know. These languages support AI and ML projects and the frameworks that power modern applications. 

Python libraries, like TensorFlow, NumPy, Keras, and Scikit-learn, are foundational tools for AI and machine learning development. JavaScript has front-end and back-end development applications through frameworks like Node.js, Vue, and React, bringing AI capabilities to web interfaces. As AI integration deepens, the essence of coding as a skill—conceptualizing and creating digital solutions—will be invaluable. The development of future products will require deep programming and product development knowledge.

We teach these languages in most of our programs because of the popularity and versatility of Python and JavaScript, but they aren’t the only viable options. Languages like Java, PHP, and C# are also highly utilized in modern programs. Whatever language you learn, coding skills transcend specific languages; by learning to code, you learn problem-solving, system design, and adaptability. With AI tools automating tasks and generating code, software engineers can focus on higher-level problem-solving and creativity. This partnership with AI enhances efficiency and highlights the importance of programming knowledge. Engineers need to understand code to oversee AI’s contributions effectively, ensuring applications are efficient, scalable, and ethical.

Understanding AI and ML Principles

Engagement with AI projects is growing—a look at GitHub’s annual report shows a massive spike in AI-related projects. Developers are adapting to incorporate these new technologies in their toolkits. Software engineers must understand how to integrate AI into their projects, extending beyond traditional applications to include AI-driven functionalities like image recognition and language translation.

Knowledge of AI principles will be critical for addressing complex challenges. Not every engineer will need to be a data scientist, but familiarity with AI and ML concepts will become more essential with time. This knowledge is vital for software engineers in two ways:

  1. The ability to implement existing AI models. You must know how to use AI tools and incorporate them into products. For example, programming knowledge will help you interact with APIs, but you’ll also need to understand the model parameters and how to tune them to get the output you want. This takes some familiarity with AI concepts and a working knowledge of manipulating models for a desired outcome. Your knowledge of Python and development practices will be helpful here, as many of the most advanced AI and machine learning models are accessible via Python (see the sketch after this list).
  2. Understanding how these technologies can be leveraged to solve real-world problems. This will soon become a real differentiator. Understanding models well enough to leverage them for specific circumstances will be critical in the future. Most of the recent discussion has been around Generative AI language models. Still, dozens of models exist for specialized purposes and work far better than ChatGPT for solving particular problems. For instance, we could implement a chatbot in a web application. What model should we use? Why that model? How can it be customized for the best user experience? These are the questions that developers will be asked in the future.
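
As a concrete illustration of point 1, the sketch below loads a small, openly available text-generation model through the Hugging Face transformers library and adjusts one generation parameter. The model choice and settings here are our own assumptions, purely for illustration:

from transformers import pipeline

# Load a small open model; any hosted or local model could stand in here.
generator = pipeline("text-generation", model="gpt2")

# Tuning parameters such as temperature changes how varied the output is.
result = generator(
    "A chatbot for a web application should",
    max_new_tokens=30,
    do_sample=True,
    temperature=0.7,
)
print(result[0]["generated_text"])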

Creativity, Problem-solving, and Ethics

As AI becomes more embedded in software development and our everyday lives, the emphasis on ethical considerations and responsible use of AI will be magnified, and unique human skills such as creativity, empathy, and ethics will become more critical. AI can automate tasks, enhance workflow efficiencies, and augment the capabilities of software developers through tools like GitHub Copilot for code suggestions or automated testing and debugging tools. However, the essence of product design and development—understanding user needs and ethical implications, as well as ensuring accessibility—remain deeply human tasks that AI cannot replicate.

This evolving landscape necessitates a collaborative approach, requiring software engineers to work closely with data scientists, ethicists, and other stakeholders to ensure AI is used responsibly and beneficially.

Navigating the Future of Software Engineering with AI

Integrating AI into software engineering is a shift towards a more dynamic, efficient, and innovative approach to technology development. However, the human element is still as relevant today as it was 20 years ago. We may not know what the future holds, but we do know a few things:

  1. AI is reshaping all industries, not just tech. This means that technical skills will become increasingly important regardless of profession because you’ll need to work with these technologies whether you are a developer or not. Even if you move into another industry—retail, aerospace, medical, finance, etc.—all these industries will soon require some understanding of AI and the skills to work with it. 
  2. Coding is becoming table stakes for everyone. Many middle and high schools in the US already teach some basic coding to prepare learners for a future where all industries are more dependent on a tech-savvy workforce. Prompt engineering, software development, and communication skills will become more valuable over time, so getting a head-start by learning to code is always a solid career choice.
  3. The world needs intelligent, creative, and informed professionals to create the next generation of technologies. As AI technology becomes more accessible, one’s ability to use AI as a platform for innovation and advancement in all sectors will be the differentiating skill set. The reality is that companies are currently deciding how to optimize their workforces by augmenting current products with AI, but that won’t last long. 

Next-Gen AI Tools

The next generation of AI-powered tools and processes will enable the rapid development of new products and experiences. Efficiency gains may help companies in the short term by reducing costs. But, that effect will diminish significantly as product development cycles speed up. To stay competitive, companies must innovate and build products faster and at a higher quality. More products, more experiences, more competition. In the long run, AI will almost certainly create more technical jobs than it will displace. Still, future jobs will require workers to display high efficiency, communication skills, intelligence, and training in multiple technical domains.

Future Roles in Software Engineering with AI Integration

As AI becomes more intertwined with software engineering, new roles may emerge that will displace some traditional programming roles. New roles like Prompt Engineer, AI Quality Assurance Manager, and AI Ethics Officer are emerging and growing in response to the rapid adoption of AI into workflows and product solutions. These roles will also likely adapt with time, so we can’t expect to know the exact titles 5-10 years from now.

However, considering Generative AI’s known capabilities and limitations, we can speculate how it will impact software engineering roles.

  • Full-Stack Developer: Developers manage front-end and back-end systems. They write business logic, implement user experiences, and incorporate AI features to enhance user experiences and backend efficiencies. These developers will use languages like Python and JavaScript to develop full-stack products incorporating adaptive content and intelligent data systems. Understanding AI will enable these developers to create more responsive and smart applications.
  • Front-end Developer: Front-end developers create the interfaces we interact with every day. They create every page you see on the web with JavaScript, HTML, and CSS and build applications using popular frameworks like React, Vue, and Svelte. Front-end developers can leverage user data to create personalized experiences, utilizing AI algorithms to tailor content and interfaces to individual preferences.
  • Back-end Developer: These developers create the server applications that talk to other systems and serve content to front-end applications. They build APIs, interact with databases, and make secure web applications by implementing authentication and validation. These developers will increasingly rely on AI for data processing and analysis, optimizing server-side operations, and enabling more sophisticated data-driven functionalities.

The Future is Bright

As AI continues to evolve, so will the roles and skills required in the field. Learning software development will give you many essential skills for the future. You’ll learn to code, work through complex problems, collaborate and communicate with stakeholders, work with AI tools, and start a lifelong growth journey.

Now is the time to embrace a life of continuous learning and ethical considerations that will be essential for those looking to lead the way in this new era. It’s never too late to start coding. We’ll see you at the keyboards!

The Data on Barbie, Greta Gerwig, and Best Director Snubs at the Oscars

When the 2024 Academy Award nominees were announced in late January, one of the most hotly discussed topics was that Greta Gerwig, director of Barbie, was not nominated for Best Director, despite the film being nominated for Best Picture. I assumed a Best Director nomination went hand-in-hand with a Best Picture nomination, so how common is it for a film to be nominated for Best Picture, but not Best Director? It turns out, fairly often, at least since 2009.

The chart above comes from Flatiron’s analysis of over 50 years of Best Picture and Best Director Oscar nominations. Films that win these two awards are often nominated in both categories.

From 1970 to 2008, the Best Picture and Best Director categories had five nominees each. It was common to see four of the five Best Picture nominees also receiving a nomination for Best Director. And in 32 of these 39 years, the film that won Best Picture also won Best Director.

In 2009, the Best Picture nomination limit increased to 10 films. Best Director remained capped at five, so naturally, this resulted in more Best Director snubs than before. In terms of winners, the larger pool of Best Picture nominees seems to be aiding in separating the two awards. Best Picture and Best Director Oscars have gone to two different films in six of the last 14 years (this happened only seven times in the 39 years before 2009).

Barbenheimer

Although it’s no longer uncommon for a film to receive a Best Picture nomination without one for Best Director, Barbie wasn’t just any film. Barbie was one half of the cultural phenomenon known as Barbenheimer: a mashup of two highly anticipated and starkly different films, Barbie and director Christopher Nolan’s historical biopic Oppenheimer, which both hit theaters on July 21, 2023. The goal of seeing both films back-to-back became one of the defining characteristics of the Barbenheimer phenomenon. While both films were hugely successful at the domestic and international box office, Barbie out-grossed Oppenheimer by an estimated half-billion dollars worldwide.

The two films dominated the zeitgeist for much of 2023 and both received enormous critical acclaim. Oppenheimer has dominated this awards season, however, garnering 13 Oscar nominations and multiple important wins at other film awards ceremonies leading up to the Academy Awards on March 10.

We’ll return to how we think about “importance” in the context of nominations, but for now, let’s compare the two films along the lines of major award ceremonies, ratings, and box office revenue.

Barbie vs Oppenheimer

The graphic above comes from our analysis comparing Barbie and Oppenheimer. Both films have numerous award nominations and have brought in over two billion dollars combined.

Aside from the People’s Choice Awards, Oppenheimer has taken home more awards overall, despite having a similar number of nominations at most award shows. Barbie appeared to be on a roll this award season, with nominations for picture, director, screenplay, actress, and supporting actor at the Golden Globes and Critics Choice Awards in early January. However, Greta Gerwig was left out of the director category when the Oscar nominees were announced on January 23. This leads to the question: which films are most similar to Barbie, not just by nomination count, but across major categories? And were those films nominated for Best Director?

Movies Like Barbie

We began our Best Director snubs analysis at Flatiron by collecting all past nominees across the entire history of the award ceremonies noted in the image above—swapping out the People’s Choice Awards for the Writers Guild Awards—for a comprehensive dataset of non-fan nominations. We also merged categories like Best Adapted Screenplay and Best Original Screenplay into one screenplay category for ease of comparison. Similarly, we lumped all acting categories (male, female, lead, and supporting) into one, and combined Best Picture categories wherever a ceremony splits them into drama and comedy/musical (as the Golden Globes does).

With a dataset of over 3,000 nominees going back to the 1920s, we found films most similar to Barbie across our grouped screenplay, grouped actor(s), director, and picture categories using Euclidean distance, a method for finding the distance between two data points. The five films below are the most similar to Barbie according to the awards and groupings we’ve selected. Interestingly, these five films, including Gerwig’s 2017 debut film, Lady Bird, all received a Best Director nomination at the Oscars (while Gerwig’s directing work on Barbie did not).
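
As a hedged sketch of the approach (the films and nomination counts below are invented for illustration; the real analysis used our full dataset):

import numpy as np
import pandas as pd

# Hypothetical nomination counts per film across our grouped categories
films = pd.DataFrame(
    {
        "picture":    [6, 5, 6, 4],
        "director":   [4, 5, 5, 3],
        "screenplay": [5, 4, 5, 3],
        "acting":     [3, 6, 2, 4],
    },
    index=["Barbie", "Film A", "Film B", "Film C"],
)

barbie = films.loc["Barbie"]
others = films.drop("Barbie")
distances = np.sqrt(((others - barbie) ** 2).sum(axis=1))  # Euclidean distance to Barbie
print(distances.sort_values())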

The table above compares Barbie’s nominations to those of other high-performing films from previous award seasons.

Predicting Best Director Snubs at the Oscars

A sample size of five is certainly not enough evidence to make a definitive claim of a snub, so we developed a predictive model that classifies a film as a Best Director nominee based on the other nominations it received, either at the Oscars or previous award shows. Our final model achieved 91% accuracy. For the astute reader, it also reached 93% precision and 96% recall. 
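
The post doesn’t name the algorithm, so the sketch below stands in with a scikit-learn logistic regression trained on synthetic one-hot nomination features, purely to show the shape of the workflow:

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, precision_score, recall_score

rng = np.random.default_rng(42)
X = rng.integers(0, 2, size=(400, 6))                        # stand-in nomination indicators
y = (X.sum(axis=1) + rng.normal(0, 1, 400) > 3).astype(int)  # stand-in target

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=42)
model = LogisticRegression().fit(X_train, y_train)
pred = model.predict(X_test)

print(accuracy_score(y_test, pred))
print(precision_score(y_test, pred), recall_score(y_test, pred))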

Based on films from 1927 to 2022, the best predictor of a Best Director nomination at the Oscars is a Best Picture nomination at the Oscars. This isn’t surprising, considering the overlap in nominees that we observed in the first image at the top of the article.

Other notable predictors are Best Screenplay at the Oscars or Critics Choice Awards, and Best Director at the Golden Globes or Directors Guild Awards (DGA). These predictors align with intuition, given the importance of a good script and how common it is to have a filmmaker with the title of writer/director. In the case of the DGA, it’s hard to think of a more qualified group to identify the best directors of the year than the 19,000-plus directors who make up the guild’s membership.

Trained Model Predictions

Finally, using our trained model, we applied it to our list of 2023 films that received at least one nomination in a screenplay, acting, directing, or picture category. Given the long list of accolades received by Barbie at the Golden Globes, Critics Choice Awards, British Academy Film Awards (BAFTA), and all the filmmaking guild awards, our model predicted Greta Gerwig to have a 76% chance of snagging a Best Director nomination. Considering she was in third, just behind Christopher Nolan for Oppenheimer and Yorgos Lanthimos for Poor Things, I’d call this a snub. (Gerwig tied for third with Justine Triet for Anatomy of a Fall.)

The chart above shows which Best Director nominations were predicted by our trained model.

Best Director Snubs and Flatiron’s Analysis

Rank-ordering the predicted probability of receiving the directorial nomination, the 2017 film Three Billboards Outside Ebbing, Missouri by writer/director Martin McDonagh was our model’s biggest snub. A film that initially received wide acclaim, it later faced criticism over its portrayal of misogyny and racism. Coincidentally, Greta Gerwig was one of the five director nominees that year, alongside Guillermo del Toro, Christopher Nolan, Jordan Peele, and Paul Thomas Anderson—a star-studded list of filmmakers if ever there was one.

the biggest "best director" snubs over the last 25 years
The table above shows where our model was highly confident—but ultimately, incorrect—that a film would receive the Best Director nod.

It’s worth noting that many of the films listed in our table above also appear in a recent Variety article that ranked the biggest Best Director snubs over the last 25 years. While the writer of the Variety article does not discuss his methodology, it’s always a good idea in data science to validate findings with subject matter experts. In the case of our analysis and the Variety article analysis, there seems to be some agreement. 

Final Thoughts

As with all predictive models, our model is only as good as the data it learns from. A common criticism of the Academy is its lack of nominating women and people of color across categories, particularly for Best Director. Mitigating bias and ensuring fairness in predictive models are important concepts in Big Data Ethics, but we’ll save the ways one could address these issues for another post.

Learn Data Science at Flatiron School

Data analyst is just one of the career paths you can embark on after graduating from Flatiron’s Data Science Bootcamp. Our bootcamp offers students the opportunity to graduate and begin working in the field in as little as 15 weeks. Download the course syllabus for free to see what you can learn!

Header photo courtesy of Warner Bros. Pictures

Taylor Swift and Data Science: An Unlikely Duo

Data is everywhere, but one thing that might be more ubiquitous than data is Taylor Swift. The recent article “Taylor’s Towering Year”—authored by Posit (formerly RStudio)—illustrates several ways in which the two are not mutually exclusive by showing the data behind her record-breaking Eras Tour. In the article, they break down the tour’s staggering ticket sales, profound effect on worldwide economies, and boost in popularity for Taylor’s opening acts. Let’s discuss how Posit accomplished this and show you a concert tour visualization of our own.

Quarto

First released in early 2021, Quarto, the tool behind the Eras Tour article, is an open-source publishing system designed to weave prose and code output into dynamic documents, presentations, dashboards, and more. Paired with a variety of ways to publish and share your content, it is an excellent platform for data storytelling. 

Deciding to learn R vs. Python is a well-covered topic, and often one prone to heated debate. In Quarto, there’s no “Bad Blood” between the two popular programming languages: you can run your project in R, Python, or both. It’s also compatible with the Julia and Observable JS languages, as well as many of the most popular integrated development environments (IDEs) used in the field of data science, like VS Code, Jupyter, and RStudio. This flexibility means data scientists can collaborate on projects using the tools of their choice.
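
To make this concrete, here is a minimal, hypothetical .qmd source file with a Python chunk; rendering it with quarto render produces an HTML page that weaves the prose, the code, and its output together (the title and figures are our own illustration):

---
title: "Concert Tour Notes"
format: html
---

A sentence of narrative written in Markdown.

```{python}
import pandas as pd

# Toy figures for illustration only (millions of dollars)
gross = pd.Series({"Eras Tour (2023)": 1040, "A Bigger Bang (2006)": 617})
gross.sort_values(ascending=False)
```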

How Quarto Generated the Eras Tour Data

Notice the “See the code in R” link in the left sidebar of Posit’s article; it takes you to a virtually identical page. The key difference is that this page lets you see the code behind the data collection and visualizations. We won’t go line by line, but let’s look at the high-level steps they took to craft the “GDP of Taylor” data visualization toward the top of the article.

Data Collection

Expand the “See R code” section just above “The GDP of Taylor” visualization to see the first code chunk, where Posit starts by web scraping the Wikipedia page for nominal GDP by country. Web scraping is a technique in which you write code to visit a website and extract information or data. Before scraping, be sure to read a website’s terms of service and its robots.txt file, which tells automated clients which pages they may crawl.
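Posit’s code chunk is written in R. As a rough Python sketch of the same idea (check robots.txt first, then pull the page’s tables), consider the following; the URL and the table’s position are assumptions that may shift as Wikipedia evolves, and pandas.read_html needs an HTML parser such as lxml installed:

    from urllib.robotparser import RobotFileParser
    import pandas as pd

    url = "https://en.wikipedia.org/wiki/List_of_countries_by_GDP_(nominal)"

    # Check robots.txt before scraping anything.
    robots = RobotFileParser("https://en.wikipedia.org/robots.txt")
    robots.read()
    assert robots.can_fetch("*", url), "robots.txt disallows fetching this page"

    # pandas parses every <table> on the page into a list of DataFrames.
    tables = pd.read_html(url)
    gdp = tables[0]  # assumed position of the main GDP table
    print(gdp.head())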

Data Cleaning

Since Taylor was estimated to stimulate the economy by over $6 billion, the collected data is filtered to countries with GDPs between $4 billion and $10 billion for comparisons of similar magnitude. Next, Posit plots the map and GDP of each of those eight countries using the R library ggplot2. Lastly, they stitch everything together, with Taylor’s image and economic impact in the center, using the cowplot library. By starting with several discrete plots and organizing them into one composition, they create an infographic that puts the Eras Tour in shocking perspective.
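Posit does this filtering and plotting in R with ggplot2 and cowplot. A hedged pandas/matplotlib sketch of just the filtering step and a single comparison chart might look like this (column names, values, and units are assumptions standing in for the scraped table):

    import pandas as pd
    import matplotlib.pyplot as plt

    # Toy stand-in for the scraped table; the live page's columns will differ.
    # GDP is assumed to be in millions of US dollars, so $4B-$10B is 4_000-10_000.
    gdp = pd.DataFrame({
        "Country": ["Country A", "Country B", "Country C"],
        "GDP": [4_500, 6_200, 11_000],
    })

    # Keep economies in the $4B-$10B window for comparisons of similar magnitude.
    similar = gdp[gdp["GDP"].between(4_000, 10_000)]

    # One simple bar chart in place of Posit's ggplot2 + cowplot collage.
    plt.barh(similar["Country"], similar["GDP"] / 1_000)
    plt.xlabel("Nominal GDP (billions of USD)")
    plt.title("Economies comparable to the Eras Tour's estimated impact")
    plt.tight_layout()
    plt.show()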

This is a great example of data science in action. As data scientists we’re often asked questions or have hypotheses but are not handed a tidy dataset. Instead, we must connect to an API or find data online, automate the process of collecting it, and manipulate it into a format that will be conducive to our analysis. Data collection and cleaning are often the iceberg below the surface while visualizations and predictive models are the parts everyone can see. Without good data, it’s incredibly difficult to produce insightful analyses.

Flatiron’s Highest-Grossing Concert Tours Data Visualization

Like Posit, we collected the data from the List of highest-grossing concert tours page on Wikipedia. Instead of a static chart, we created a bar chart race—a fun way to visualize data changing over time using animation. Below we have the highest single-year tours by gross revenue from 1993 to 2023. 

A gif showing the highest-grossing musical tours by year.

The Rolling Stones and U2 tours held most of the top five spots for a majority of the past 30 years. That is, until the 2023 Eras Tour nearly doubled the $617 million grossed by the A Bigger Bang Tour—the 17-year record-holder set by the Stones in 2006. Interestingly, Taylor Swift is the first female solo artist to crack the list since Madonna’s The MDNA Tour in 2012. With the Eras Tour projected to bring in another $1 billion in 2024, Taylor Swift may take the top two spots come end of year.
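If you’d like to try a bar chart race yourself, the core trick is redrawing a sorted bar chart once per year and stitching the frames into an animation. Here is a minimal matplotlib sketch with toy data standing in for the scraped tour grosses:

    import pandas as pd
    import matplotlib.pyplot as plt
    from matplotlib.animation import FuncAnimation

    # Toy data standing in for the scraped grosses: (year, tour, gross in $M).
    data = pd.DataFrame({
        "year":  [1993, 1993, 1994, 1994, 1995, 1995],
        "tour":  ["Tour A", "Tour B", "Tour A", "Tour C", "Tour B", "Tour C"],
        "gross": [100, 80, 120, 95, 140, 130],
    })
    years = sorted(data["year"].unique())

    fig, ax = plt.subplots()

    def draw(year):
        # Redraw the top five tours for this year, largest bar at the top.
        ax.clear()
        top = data[data["year"] == year].nlargest(5, "gross").sort_values("gross")
        ax.barh(top["tour"], top["gross"])
        ax.set_title(f"Highest-grossing tours, {year}")
        ax.set_xlabel("Gross revenue ($M)")

    anim = FuncAnimation(fig, draw, frames=years, interval=800)
    anim.save("tour_race.gif", writer="pillow")  # requires the pillow package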

This analysis was originally created in our own internal Quarto project at Flatiron School and copied over here onto our blog. Give Quarto a try and you might just tell Jupyter notebooks and RMarkdown, “We Are Never Ever Getting Back Together.”

Header image credited to Posit

What Does a Cybersecurity Engineer Do?

Technology permeates almost every aspect of modern life – payment systems, fitness trackers, baby monitors, smart fridges – it’s simply everywhere. As a result, the modern digital landscape is full of opportunities for bad actors to take advantage of tech meant to make life easier.

The ongoing threat of cyber attacks has led governments and private companies to employ increasingly sophisticated teams and techniques to secure their systems. But, in the field of cybersecurity, few roles are more central – or critical – than that of the Cybersecurity Engineer. 

What Is A Cybersecurity Engineer?

Cybersecurity Engineers are IT professionals who assess, prepare for, and respond to cyber attacks. They often directly recommend and oversee the tools and methods used to keep systems secure and stay ahead of new or emerging threats such as malware, ransomware, and hacking/intrusions.

With a diverse skillset and deep understanding of a company’s networks, Cybersecurity Engineers are often well-situated to work in any number of entry-level, mid-level, and senior security roles as their careers advance. 

What Skills Does a Cybersecurity Engineer Need?

Security engineers must have a range of talents that address the most common and critical vulnerabilities a company may face.

Programming Languages

Cybersecurity engineers are strong programmers in multiple languages and use them not only to evaluate or build the latest solutions but also to deconstruct the malicious code and applications used to attack their systems. And because their efforts must fit the nature of their business, they need to be prepared to work in general-purpose languages like Python or C++ as well as in languages specific to web or application work.
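As a toy illustration of the kind of small utility this can mean in practice, here is a hedged Python sketch that hashes files and flags any matching a list of known-bad SHA-256 digests. The digest and the directory are placeholders; real indicator lists come from threat-intelligence feeds.

    import hashlib
    from pathlib import Path

    # Placeholder indicator of compromise (IOC); not a real malware hash.
    KNOWN_BAD = {"0" * 64}

    def sha256_of(path: Path) -> str:
        # Hash in chunks so large files don't have to fit in memory.
        digest = hashlib.sha256()
        with path.open("rb") as f:
            for chunk in iter(lambda: f.read(8192), b""):
                digest.update(chunk)
        return digest.hexdigest()

    for file in Path("downloads").rglob("*"):
        if file.is_file() and sha256_of(file) in KNOWN_BAD:
            print(f"ALERT: {file} matches a known-bad hash")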

Penetration Testing

A cybersecurity engineer must be able to assess the results of penetration testing and work with pen testers to address new threats that have emerged in the industry.

Live Threat Management

They are critical stakeholders in real-time assessment, mitigation, and prevention of threats, as well as in live incident response and defense. They may also get pulled into troubleshooting everyday problems in the network and be asked to help drive broader company initiatives for a healthy network, such as change management, security audits, etc.

Strong Organizational Skills

Critically, as members of a Security Operations Center (SOC) team, cybersecurity engineers are typically responsible for documenting and maintaining policies and procedures related to security concerns. This gives them a strong view of a network’s structure and defense, from the top down to the most granular level.

Hardware and Software Evaluation

They also need to be able to review new hardware and software options for ensuring network security and to understand specific threats their company may face from hackers.

Related reading: How To Become A Cybersecurity Engineer

What Is The Demand For Cybersecurity Engineers?

The Cybersecurity talent gap is well documented. There simply aren’t enough talented Cyber workers to fill available positions, with almost 580,000 job openings in the USA alone*. Demand is high and growing, with information security roles projected to grow 32% over the next eight years (much faster than average).

The prevalence of tech throughout everyday life is only expanding, with each day bringing more products, services, and innovations. Pair this with the persistent headlines of cyber incidents affecting countries, companies, and individuals each news cycle, and the Cybersecurity talent gap only promises to widen. That can mean only one thing for those interested in a Cybersecurity career: opportunity.

Cybersecurity Engineer Salaries

Overall, Cybersecurity professionals are paid quite well. According to ZipRecruiter, the average Cybersecurity Engineer makes $111,095* per year nationwide. 

Further reading: How Much Do Cybersecurity Engineers Make Around the Country?

As you progress through a cyber career, salaries increase even further: 

  • Chief Security Architect: $177,571 
  • Lead Security Engineer: $122,890 
  • Chief Information Security Officer (CISO): $148,746 

If the C-suite isn’t your idea of success, Cybersecurity Engineers can also move into alternative focus areas. For example, they can use their knowledge of a firm’s networks to operate as high-value penetration testers. They can also become cybersecurity consultants, working with clients on a case-by-case basis.

Considering the high demand and the quality pay, the next question you’re likely to ask is “How do I become a Cybersecurity Engineer?” Well then, friend, keep reading.

How To Become A Cybersecurity Engineer

There are two main steps to becoming a Cybersecurity Engineer – acquiring skills and demonstrating those skills with a Certification. 

Did you notice that “get a traditional 4-year college degree” wasn’t on the list?

More and more often, 4-year degrees are no longer a prerequisite for entry-level Cybersecurity roles. While this is still an evolving trend, at the end of the day, the only thing companies truly care about is whether or not you have the skills to perform on the job.

We’ll cover the basics of the process below, but check out our blog “How To Become A Cybersecurity Engineer in 2024” for a more thorough overview. 

Step 1: Acquire Skills

Cybersecurity Engineers generally have a mix of hard skills and soft skills. Their hard skills include various programming languages and a broad knowledge of networking, database platforms, and operating systems. Soft skills, though intangible, are just as important when it comes to working on a team – these can include top-notch communication, problem-solving, and a general willingness to learn and adapt to changing conditions. 

While soft skills are developed over an entire lifetime and decades of a career, hard skills are easier to acquire in a short time – especially if attending an accelerated program like a Cybersecurity bootcamp.

Related reading: Are Cybersecurity Bootcamps Worth It?

Step 2: Earn Relevant Certifications

Cybersecurity Certifications generally achieve a similar purpose to earning a college degree – they demonstrate to prospective employers that you have the core hard skills required to succeed in the cybersecurity engineering space. 

For examples of common certifications, see the related reading below.

Related reading: 9 In-Demand Cybersecurity Certifications

See Yourself In Cybersecurity

Cybersecurity engineering promises to remain a critical and exciting career opportunity for the foreseeable future, and the ongoing talent gap bodes well for those looking to transition into the field.

If you’re set on a career in Cyber, Apply Today to Flatiron School’s Cybersecurity Bootcamp – a program designed to have you industry-ready in as little as 15 weeks. 

Not ready to commit? In the meantime, test-run what we teach in the course with our free Cybersecurity Prep – no strings attached!

*Job opening numbers and average salary cited 8 Jan 2024

How Much Do Coding Bootcamps Cost in 2023?

Coding bootcamps have grown in popularity in recent years, offering an accelerated path to acquire the skills needed to pivot into a new tech career. But, what exactly are coding bootcamps? Who do they help? And – most importantly – how much do coding bootcamps cost?

What Is A Coding Bootcamp?

Put simply, coding bootcamps are intensive educational programs that rapidly teach students the programming skills needed to become full-stack web developers.

At the end of a bootcamp, students walk away with hands-on coding experience in languages like JavaScript, HTML and CSS, and Ruby, plus a strong portfolio of projects.

As for who attends bootcamps, there really is no “typical” student. They may be career changers coming from a non-technical background who want to break into tech, professionals looking to deepen their technical skills, or new participants in the workforce taking an alternative avenue to traditional university schooling. No matter where our students come from, attending a coding bootcamp helps them develop an in-demand technical skillset. 

How Much Do Coding Bootcamps Cost In 2023?

Bootcamps are more than just one-off courses — they’re intensive, often fully immersive programs. But how much does a coding bootcamp cost?

According to Career Karma, the average cost of a coding bootcamp in 2023 was $13,035. Program cost depends partially on format, with full-time options averaging $14,237 and part-time options coming in at $12,226.

Tuition At Flatiron School

Flatiron School’s tuition varies by discipline – $17,900 for Software Engineering, and $16,900 for Cybersecurity Engineering, Data Science, and Product Design. While tuition represents a significant investment, eligible applicants can apply for scholarships (Merit, Access, Women Take Tech), and may qualify for loan options to fund their education. 

While tuition certainly is a big factor to consider when choosing a further education program, we’d encourage prospective students to make a decision based on a multitude of reasons – in addition to the price tag. 

Factors to consider include:

  • Curriculum Quality. Is it reviewed by industry experts? Does it include up-to-date emerging technologies like Artificial Intelligence? 
  • Instructor Experience and Availability. Will you learn from knowledgeable, experienced professionals? Will there be continuous support throughout the learning process?
  • Program Flexibility. Is a program entirely online or remote? Do you value the in-person classroom experience, or would you prefer to log in from anywhere? 
  • Pacing Options. Can you commit to a full-time, 40 hours a week learning schedule? Or do you need more flexibility in your life? 
  • Academic Support. Are there technical coaches there to help you through a learning challenge? Or, is there an advising team to turn to if you’re struggling to adapt to the program’s demands? 
  • Career Services Support. Flatiron School graduates receive up to 180 days of 1:1 career coaching to help launch their careers – support that’s difficult to put a price tag on (but would realistically cost thousands of dollars for a similar service).

When you select a bootcamp, you’re signing up for a path to the future. Choose a program that will give you what you need to succeed – in-demand skills, hands-on learning, and support both during the program and after graduation.

Coding Bootcamps vs. College

We mentioned that coding bootcamps are becoming an increasingly popular means of getting a technical education. However, you might wonder how they stack up against getting a computer science degree, especially once you consider that a bootcamp’s tuition may be roughly equal to a semester of college.

Some of the content may be similar, but there are fundamental differences between these two avenues, and we’ll cover some of them here.

1. Time Commitment

Most full-time students spend at least four years in pursuit of a bachelor’s degree, attending classes for several hours each day and committing several more in the evenings to studying and completing homework assignments.

Full-time bootcamps, on the other hand, focus on accelerated learning, which means that most of them run for 12 to 15 weeks, though that time frame can vary based on the curriculum and pacing (a part-time bootcamp will run longer).

Bootcamps frequently have rigid schedules. If you need more of a part-time option, though, Flatiron School offers flexible pacing that lets students work through the curriculum over up to 60 weeks. This is helpful for people who have other time commitments, such as a family or a career outside of their education.

2. Financial Investment

There’s no comparison between the average cost of a bootcamp and the cost of a bachelor’s degree.

The average price for a year of college in the 2023–2024 school year can run north of $29,150, with top-rated technology schools such as Carnegie Mellon and the Massachusetts Institute of Technology (MIT) coming in at above $60,000 per year. On the other hand, even the most expensive bootcamps top out at around $30,000, and the vast majority fall in the $10,000–$20,000 range.

How To Pay For A Coding Bootcamp

The most straightforward way to pay for your education is to pay upfront. If you have the means, the advantage here is that you don’t have to think about it afterward and can just focus on your studies.

Of course, not everyone can foot the bill all at once. There are several alternative ways to pay for bootcamps:

  • Scholarships. Many coding bootcamps offer scholarships that students may apply for. These are often awarded based on merit or diversity initiatives — such as scholarships for women or underrepresented minorities. You can also check for any local resources that might offer scholarships.
  • Educational loans. While there isn’t much federal aid available for coding bootcamp tuition, the good news is that several private lenders offer financing to bootcamp students.
  • Payment plans. Some bootcamps offer the option to pay for your accelerated learning in installments. The terms of your payment plan will differ from school to school.
  • Employer assistance. Many employers offer tuition assistance benefits; ask yours about what’s available!

What Salary Can I Expect After Graduating From A Coding Bootcamp?

At the end of the day, what really matters when selecting a coding bootcamp is the outcome – the likelihood of getting a software engineering or similar job post-graduation. While no program can guarantee a job post-graduation (unless they’re giving you one), it is helpful to look at previous graduates’ job placement rates to get an idea of what to expect.

Job Search Support

The best coding bootcamps support their students’ job searches post-graduation and transparently report the results of those searches each year. Unless the program is new and you’re willing to take a risk, you’ll want to steer clear of bootcamps that either have low job placement rates or no data available. It’s a good idea to ask for a third-party verified outcomes report so that you have a clear picture of what you might expect upon successful completion of the program.

Flatiron School is proud to be the first coding bootcamp to put out job placement statistics that were examined by an independent third party. And, with our robust career services that offer up to 180 days of career coaching post-graduation, we’re proud to say that in our 2022 jobs report, 90% of graduates* got a job in tech.

*For job-seeking full-time and part-time graduates included in the 2022 Jobs Report, including full-time salaried; full-time contract, internship, apprenticeship, and freelance roles; and part-time roles during the reporting period. (See full Jobs Report here.)

Reported Graduate Salaries

Flatiron grads aren’t just getting tech jobs — they’re being paid well too.

Graduates of the Software Engineering program at Flatiron School boast average starting salaries higher than the national average. See the most recent jobs report for complete details.

Are Coding Bootcamps Worth It?

Coding bootcamps are phenomenal programs for developing new skills, but they aren’t for everyone.

Whether or not a coding bootcamp will be worthwhile for you depends heavily on your goals and your ability to commit to the workload. Below are some benefits and disadvantages of attending one.

Advantages Of Attending A Bootcamp:

  • Develop skills quickly. For those looking to pivot to a career in tech, four years and thousands of dollars in tuition for a college degree might be too big an investment. Bootcamps teach you real-world job skills for a fraction of the cost, and in a fraction of the time.
  • Options to study while working. Many bootcamp students are already professionals in another field. They want to make a transition into a specific career — whether that be software engineering, data science, product design, or cybersecurity — but don’t have time or desire to put their current job on hold to pursue further education. The flexibility of a bootcamp makes that a possibility. 
  • Affordable tuition. While coding bootcamps do represent a sizable investment, they are much more affordable than college degrees. Even the priciest bootcamps cost less than a semester’s tuition at America’s top tech schools.
  • Career coaching included. Many top bootcamps offer career services and job search assistance to recent graduates. These offerings include anything from accountability check-ins to materials review and mock interviews. Part of landing a tech job is being prepared for the technical interview, and career coaches can make all the difference to job seekers.
  • Networking opportunities. A robust professional network is critical for any job seeker, and the good news is that bootcamp grads have one baked into their programs in the form of classmates entering the new industry alongside them.
  • Develop in-demand skills. Where university programs might hold onto curricula that include legacy languages and out-of-date development approaches, bootcamps teach the most current and in-demand paradigms and emerging technologies (such as artificial intelligence). Starting and finishing your program within a year — or even a few months — means that there’s virtually no danger of your skills becoming outdated before you finish.

Disadvantages Of Attending A Bootcamp

  • Following a tuition payment schedule. Coding bootcamp tuition is paid either upfront or over a much shorter period than federal student loans – 5 years vs. 10 years. But since bootcamp tuition is far lower than even a year of attendance at many schools, the overall amount paid will still be lower despite the compressed schedule.
  • Fast-paced learning schedule. Bootcamp programs are fast-paced, so students need to be prepared to study and work on projects for long hours to succeed.
  • Some employers prefer degree holders. There are still HR managers out there who expect candidates to have a college degree. However, many in the industry consider bootcamp graduation a respectable means of developing the skills required to get started in a rewarding career.

No matter where you get your start, it’s important to keep learning as tech evolves. Starting with a bootcamp and later going for a degree can be a viable path to advancement, though it’s just as likely – especially as opinions on university education shift – that you can learn the skills you need to move up the ladder both on the job and on your own.

Ready To Get Started Learning To Code?

We covered a lot in this article, and we hope that you’re better equipped to do your research on the various bootcamps and educational paths that are available today.

But, whichever programs you’re considering, we’d encourage you to look at alumni stories and see if you can relate to their journeys. No matter where you come from or what technical level you’re starting from, a coding bootcamp can get you one step closer to your dream career. Some of our recent alumni’s journeys are below:

Apply Now to join other career changers just like these in a program that sets you apart from the competition. 

Need more time to be ready to apply? Try out our Free Software Engineering Prep and test-run the material we teach in the course.

Read more stories about successful career changes on the Flatiron School blog.

11 Best Websites to Practice Coding for Beginners in 2024

Indeed’s Best Jobs of 2023 ranked America’s most highly prized careers based on demand, pay, and potential for growth. These careers included:

  • Full-stack developer (#1)
  • Data engineer (#2)
  • Back end developer (#6)
  • Site reliability engineer (#7)
  • Director of data science (#25)

That means half of the top ten best jobs in America require coding skills. But, if you don’t have coding skills, where do you go to learn them?

At Flatiron School, we help pave the way for your transition into the tech industry. Our comprehensive programs are designed to provide you with the foundational knowledge you need to develop the coding and programming skills that are in such high demand. Our programs help students discover their full potential and pursue the career of their dreams.

But maybe you’re just ready to dip your toe in and explore your options? There are several coding-for-beginners resources online. So whether you’re looking to learn a new coding language or try out programming for the first time, it’s worth looking into coding practice sites for beginners.

11 Best Coding Practice Sites

Coding is not a spectator sport. It’s great to watch tutorials and read books on how to code, but to truly develop your programming proficiency, you must write the language yourself. Here are some of the best places to practice your coding skills.
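To give a flavor of the bite-sized exercises these sites serve up, here is the classic FizzBuzz warm-up in Python, a staple first challenge on many practice platforms:

    # Print 1 through 100, replacing multiples of 3 with "Fizz",
    # multiples of 5 with "Buzz", and multiples of both with "FizzBuzz".
    for n in range(1, 101):
        out = ""
        if n % 3 == 0:
            out += "Fizz"
        if n % 5 == 0:
            out += "Buzz"
        print(out or n)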

1. Coderbyte

As you begin to develop your coding skills, you may be unsure what to practice first. It helps to work on real-world problems other coding professionals have faced—and Coderbyte has exactly that. With over 2,000 challenges on front and back end development, data structures, and algorithms that professionals have faced in their interviews, you’ll be able to hone your skills on examples that really matter.

Pros of Coderbyte

‌Coderbyte lets you use over 30 different programming languages and has a library of over 3 million solutions you can learn from.

Cons of Coderbyte

Coderbyte has a user interface that can be complex for new users to navigate, so it might take some time to get used to the site.

2. Pluralsight

If you’re seeking a comprehensive learning platform that allows you to learn at your own pace, go with Pluralsight. You’ll develop a strong skill set in Python, JavaScript, HTML, and CSS, just to name a few. You can also receive learning recommendations based on what you’re focusing on.

Pros of Pluralsight

The platform provides a personalized learning experience as you can customize your training sessions with multiple features, including multiple language support.

Cons of Pluralsight

Users have minimal interaction with course instructors and industry experts, which makes it difficult to gain valuable feedback.   

3. Edabit

Unlock over 10,000 interactive coding challenges with Edabit. This free platform provides courses that are simple and practical. You can also access beginner tutorials to make the most of your learning experience. Plus, the challenges are ranked by difficulty, so you know exactly what level you’re at in your programming expertise.

Pros of Edabit

Learners gain access to a code editor that’s built into the platform, so they can write code without having to switch to another application.

Cons of Edabit

You don’t earn a certificate for completing any of the tutorials. Also, some users have stated that the platform uses outdated programming languages.

4. CodinGame

Wanna play a game? CodinGame lets you practice your coding through fun games and code challenges. With single-round matches and both solo and multiplayer modes, this platform gives you a chance to practice coding the fun way.  

Pros of CodinGame

There are over 25 available programming languages.

Cons of CodinGame

Some users take issue with the size of the timers displayed on the programming tests. Unless you look carefully for the timer, you may miss the opportunity to submit your answers on time.

5. CodeChef

Competition can be the best motivator to learn, and CodeChef offers exactly that. This platform lets users measure their skills by practicing more than 3,000 problems. You can compete against other coders, which creates great coding practice for beginners. But don’t worry—the competition is friendly, and participants often write posts and tutorials to help each other learn.

Pros of CodeChef

CodeChef users will be pleased to know that there is an active, supportive community that encourages growth. 

Cons of CodeChef

Users report that some practice problems lack clarity.

6. Project Euler

Project Euler offers a chance to solve challenging math problems with code. Over 1 million users have solved at least one problem on the site.
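For a taste, the site’s very first problem asks for the sum of all multiples of 3 or 5 below 1,000, which fits in a single line of Python:

    # Project Euler, Problem 1: sum the natural numbers below 1,000
    # that are multiples of 3 or 5.
    print(sum(n for n in range(1000) if n % 3 == 0 or n % 5 == 0))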

Pros of Project Euler

The site supports 111 programming languages.

Cons of Project Euler

If math isn’t your strongest subject, Project Euler may not be for you. The math-oriented problems get quite complex as you progress through the challenges.

7. TopCoder

Join a community of 1.7 million technical experts at TopCoder. On the learning side, they have an abundance of weekly challenges and explanations, along with challenging competitions that help you rise to the coding occasion.

Pros of TopCoder

The site is one of the most established platforms with an active user base.

Cons of TopCoder

Some users have experienced difficulty navigating the program’s user interface. Also, if you submit a support request, it may go unnoticed as their support system lacks efficiency.

8. One Month

Learn coding and web development in a span of 30 days with One Month. You can select from a variety of basic courses that cover HTML, Python, SQL, Ruby, and more! You also get to create real-world projects.

Pros of One Month

This user-friendly resource is great if you’re just looking to learn the fundamentals of coding and web development.

Cons of One Month

If you decide to switch to another programming platform, just note that there are no refunds available.

9. Geektastic

With detailed solutions to their multiple-choice and peer-reviewed coding challenges, Geektastic has a wealth of resources for programmers and a growing community of over 26,000 developers.

Pros of Geektastic

In addition to their interactive challenges and competitions, coders who rank high enough might even be invited to join the review team. Members of this team get paid to review coding submissions for clients seeking solutions to their own coding projects.

Cons of Geektastic

Some users have raised concerns with how the challenges are scored, stating that they don’t reflect a candidate’s programming skills well. 

10. Geeks for Geeks

Made by developers for developers, Geeks for Geeks offers coding content for programmers of all skill levels, including beginners. Exercises in data structures, machine learning, web development, and much more are available.

Pros of Geeks for Geeks

Competitive challenges offer interactivity and a space to share coding solutions.

Cons of Geeks for Geeks

Geeks for Geeks primarily provides information in English, which means non-English speakers could run into trouble with the resources available. The website also has limited multimedia resources, mostly relying on text-based information.  

11. HackerEarth

What’s great about HackerEarth is that users can create and customize their coding assessments for technical positions. 

With HackerEarth, you’ll join a community of 7.6 million developers, participate in several programming challenges and customize your tests for a specific role.

Pros of HackerEarth

Not only does the website offer support in multiple languages, but it also includes AI proctoring to ensure exam results are accurate and reliable. 

Cons of HackerEarth

You might run into trouble navigating to specific problem types because the website has troublesome indexing and prioritization. 

Blogs to Help You Learn

They may be less interactive than competitions and online courses, but plenty of coding-for-beginners blogs are available to help new students gain programming proficiency. Here are some of our favorites.

1. The Crazy Programmer

This blog isn’t designed to give hands-on coding experience, but there’s a wealth of programming knowledge on pretty much everything else. From useful books and articles to tutorials and Q&As, The Crazy Programmer is a great blog to follow for those just learning to code.

2. The Hackr.io Blog

If you’re looking for courses or content that will grow your coding skills, The Hackr.io Blog will help you look in the right places. Written by a community of programming professionals, this blog is devoted to grading the most useful coding content so readers know they’re relying on quality sources. 

The blog touches on a wide range of topics, though, so those looking for resources on a specific language may find their content hit and miss.

3. Better Programming

As its name suggests, this blog is devoted to improving your programming. With posts on a range of topics in web design and coding, Better Programming features content from multiple industry pros on both introductory and advanced content. There’s truly something for everyone. As with Hackr.io, those concentrating on a specific topic may want something more focused.

Flatiron School: The Ultimate Coding Solution

What makes us different from coding websites? Here at Flatiron School, we work tirelessly to help students gain the foundational coding skills they need to begin a career in the tech industry. Combining flexible program options, industry-leading education, and up to 180 days of Career Coaching upon graduation, Flatiron School gives our students the jumping-off point they need to begin a rewarding tech career.  

Our programs contain a mixture of lectures, group work, instructor guidance, and community support to both equip our students with technical skills and prepare them to work effectively on a team. No matter what level you begin at, Flatiron School’s Software Engineering program can take you from complete beginner to industry-ready in as little as 15 weeks.

If you’re committed to a career in tech, we’ll teach you the skills you need to succeed. 

Apply today to get started. Not ready to apply? No problem – test out our material with Free Software Engineering Prep Work or download the course syllabus.

If you’re an employer looking to bring new talent to your team, check out our tech training solutions and see how you can invest in your company’s growth. 

Software Engineer Salary by Company in 2023

Software engineering is a career that continues to be in high demand. In fact, the Bureau of Labor Statistics reports that demand for software developers is projected to grow 25% from 2022 to 2032 (much faster than average). In a world so reliant on digital experience, this number comes as no surprise. 

Companies of all sorts turn to software engineers to design, implement, and maintain software systems that address business challenges and client needs. With the right technical and programming expertise, software engineers can succeed in any industry.

So, if you’re considering a career in the field, you may be asking yourself – what can you expect from a Software Engineer’s salary? Below we’ll review how much engineers are paid at some of the top tech companies.

Software Engineer Salaries at 23 Top Tech Companies

Different industries have distinct organizational needs, so it’s no surprise that the average software engineer salary can vary based on the employer. Let’s look at some major corporations and the average software engineer salary* for each one.   

Amazon

You can earn anywhere from $146,000 to $233,000 annually as a software engineer with Amazon. The base pay, which doesn’t account for bonuses, stocks, or commissions, averages $183,000 per year. If you have more experience, you may qualify for a Software Development Engineer II position, which earns between $197,000 and $296,000 per year.

Apple

For a multinational technology company like Apple, software engineering earnings average $145,739 per year. Junior engineers average $86,725 with the company, while those with senior-level experience earn $190,489 on average.

Bloomberg

At Bloomberg, software engineers receive an average of $153,608 annually, which is higher than the national average in this role. An entry-level salary comes in around $62,000.

Capital One

The latest numbers show that Capital One software engineers make an estimated $139,952 annually, with top earners making as much as $205,000. Junior-level engineers can expect annual earnings of $103,000.

Cisco

The average engineering salary for Cisco employees depends on the job grade. Entry-level workers earn around $111,000 per year. Those at the highest level (distinguished engineers) report up to $708,000 in total compensation, which includes stock options and bonuses.

Citadel

Citadel software engineers bring home an average of $116,872 annually. The highest-salaried employees report earning $175,000.

Facebook (Meta)

Those working at Facebook (or Meta, the parent company) can expect an estimated annual salary of $207,000. This number represents the median salary and includes applicable bonuses, stocks, and commissions. As an entry-level worker, you may earn up to $189,000 annually.

Goldman Sachs

Goldman Sachs software engineers receive approximately $119,298 per year. This estimate is almost 20% higher than the average earnings for software engineers across other companies. As a newcomer to the company, you may earn around $85,000, although your location also plays a role in your starting salary. 

Google

A software engineer working for the world’s most-used search engine earns an annual salary of $145,645. On average, entry-level engineers earn $95,793, while senior-level workers receive around $183,952.

IBM

In an entry-level software engineering position at IBM, average annual earnings amount to $92,043. Regardless of position level, the standard pay is $113,412 per year, with the highest earners making as much as $224,000.

Jane Street

Upper-level Jane Street software engineers report earning around $333,000 in total compensation. Those at the lower level earn between $101,000 and $156,000 per year.

Koho

The latest estimates show that Koho software engineers with at least seven years of experience can expect to earn a base salary of $122,263 per year. Those at the senior level make an estimated $138,000 annually.

Lockheed Martin

At Lockheed Martin, a software engineer earns an average of $97,883 per year. On the low end, beginners make $93,190 annually, while senior engineers bring home a yearly average of $118,330.

Microsoft

Working for Microsoft offers an average salary of $147,129, although the exact yearly pay depends on experience level.

Netflix

A software engineer employed at Netflix earns an average of $193,626 annually. Exact earnings in 2023 range from $46,000 to $444,000 per year, depending on expertise.

Oracle

Oracle software engineers earn $141,515 per year on average. High-earners report taking home $277,000. At the entry-level, annual earnings still reach $76,105.

Revature

A Revature entry-level software engineer makes an average of $101,000 per year. The reported pay range for this position is $81,000 to $127,000. 

Salesforce

Salesforce software engineer employees report average annual earnings of $120,392. This estimate is about 20% higher than the national average of $100,260. Working in an entry-level position offers an average of $89,000 per year, while the higher-earners in the organization report making around $162,000.

Stytch

The base average software engineering salary at Stytch is $136,647 per year. At the entry-level, you might earn around $126,650 each year, while executive-level or high-end earners report average annual earnings of $145,557. 

Tesla

Tesla software engineers make a little over the national average, with a yearly average salary of $107,322. If you’re starting with little to no experience as an entry-level worker, your earnings may be closer to $81,000. The average salary for engineers with several years of experience is $141,000.

ThousandEyes

How much can you make at ThousandEyes? Broadly speaking, $160,717 is the average salary for software engineers. Depending on a candidate’s background and credentials, actual figures range from $67,000 to $294,000, with senior engineers at the top end.

Uber

Uber software engineers earn an average yearly income of $139,952. Beginner earnings come in at around $52,500, while high-level workers earn about $205,000. 

Walmart

A U.S.-based software engineer earns an average annual wage of $105,434 at this global retail corporation. 

Ready To Become A Software Engineer?

The opportunities available to software engineers with an in-demand skillset are varied and increasing. Whatever the state of the “traditional” tech industry represented by the companies listed above, every organization – regardless of industry or size – needs an engineer’s capabilities.

But, to land a software engineering position and start earning like an engineer, you’ll need the right skillset. That’s where Flatiron School comes in. 

Our Software Engineering program takes a holistic approach to learning, teaching both the latest technologies and problem-solving techniques. We’ll teach you how to code, but also how to learn and adapt. That way, no matter what technological revolution comes your way, you’ll be ready. Whether you have zero coding knowledge, are self-taught, or are somewhere in between, this course will take you from foundational skills to industry-ready in as little as 15 weeks.

Apply now to get started on your journey to becoming a Software Engineer. 

*Salaries cited as of 19 October 2023