
🎓 Certification Included 🎓

Awarded upon completion of the project material.

Prerequisites:

🤩 Even if you lack experience in any of the areas below, this project provides the resources you need to fill the gaps.

  • Python programming skills: Basic to intermediate knowledge of Python.

  • SQL and relational databases: Basic familiarity with SQL and how relational databases work will help you understand data storage and querying processes.

  • Familiarity with web technologies: Understanding of HTML, CSS, and JavaScript basics to navigate and scrape web data.

  • Google Cloud Platform account: Set up a free trial on Google Cloud Platform (GCP) to access various services used in the project.

⏰ Estimated Completion Time:

  • For Beginners (1+ year of programming experience with Python): 2-3 weeks.

  • For Experienced Data/Software Professionals (1+ years of professional experience with Python, or a CS degree): 2-3 days.

Data & Cloud Engineering Project

Automated Web Scraper for Python Job Listings on Google Cloud Platform

Build and deploy an automated web scraper on Google Cloud Platform to gather job listings efficiently. This project equips you with the skills to scrape, process, and store data using advanced cloud technologies.

🎯 Professionals typically need at least three months to acquire the skills that this project can teach you in two weeks!
Build a modern data pipeline on Google Cloud Platform from scratch.

What you will learn:

  • Scrape data dynamically using Selenium: automate the extraction of data from websites without API access.
  • Handle web elements and manage data extraction: learn to interact with web pages programmatically and pull out the data elements you need.
  • Deploy and schedule web scraping tasks: use Google Cloud Run and Cloud Scheduler to automate and manage your scraping tasks without manual intervention.
  • Store and analyze scraped data: use Google BigQuery to handle large datasets and run complex queries quickly.
  • Prepare for interviews: learn how to present this project effectively to potential employers.





💭 Who is this project for?

This project is perfect for many tech professionals, from beginners to experienced engineers looking to explore data and cloud technologies. Whether you're looking to add impressive projects to your portfolio or starting your first project in the cloud, this project is designed just for you:

#1 Aspiring Data Engineers & Python Developers

If you're starting your tech career or enhancing your programming skills, this project offers a standout opportunity. Most beginners don't know how to deploy web scrapers in a production environment. By building and managing a production-grade web scraper on Google Cloud, you'll distinguish yourself from other candidates. This practical, scalable project experience can help you land your first full-time job.

#2 Machine Learning and Software Engineers

For Machine Learning and Software Engineers, this project is an excellent chance to expand your expertise into data collection and cloud infrastructure management. It's perfect for professionals eager to learn about data pipelines for machine learning models or those interested in automating web data interactions. This knowledge is crucial for integrating robust data handling into your development processes.

#3 Experienced Cloud and BI Engineers

If you're proficient in cloud technologies or business intelligence, this project will broaden your capabilities by introducing advanced data processing and automation in Google Cloud. Learning to implement automated web scraping and task scheduling with Cloud Run and Cloud Scheduler will enhance your role and pave the way for new career prospects.

#4 Career Switchers to Tech

If you're transitioning into tech from fields like finance, marketing, or science, this project is your gateway. It offers hands-on experience with coding, data handling, and cloud deployment, equipping you with the skills needed for tech roles. This project ensures you gain the confidence to tackle technical tasks and discussions, positioning you well for roles in data and cloud engineering.


What's included?

This is not just a project; it's a complete package that contains everything you need to build the project successfully and master the given skills and technologies. Specifically, it offers:

💎 Real-World Portfolio Project:

"Automated Web Scraper for Python Job Listings on Google Cloud Platform"

We provide detailed, step-by-step instructions on how to start, build, and deploy your project from scratch. You don't need to worry about a thing: every step of the process is clearly documented in an easy-to-read format. We provide both code and instructions because we want you to concentrate on the project's core value: learning how to combine cloud technologies into a high-value system that can gather data from any website.

By implementing this project, you'll gain experience in high-demand cloud technologies and skills while learning how to build a production-ready project. You'll be able to add a unique portfolio project to your resume, setting you apart from other candidates and boosting your confidence for your interviews to secure your next high-paying job in data!

🌟 By the end of the project, you will have developed and deployed an automated web scraper to gather Python job listings daily using modern cloud-based technologies.

🎁 Bonuses Included with Every Enrollment:

We offer much more than just a project. You'll engage with various technologies and tools, and to help you navigate them, we provide tailored mini-courses. These courses cover a range of essential topics, from Python and Google Cloud fundamentals to setting up a free Google Cloud Account and using Data Warehouses such as BigQuery. Everything you need to know is included.

Furthermore, we offer mini-courses on Git and GitHub, complete with detailed instructions on creating a GitHub repository and uploading your code. This visibility can attract recruiters and companies.

We also guide you in sharing your project on LinkedIn to secure your next interviews. Our support extends to offering advice on presenting the project in interviews and responding to common interview questions.

Finally, you'll receive access to our active community of 40+ individuals working on similar projects. Here, you can ask questions, get answers, and receive support from other members and the instructor. We pride ourselves on our excellent community support, as noted by our users.      

😍 Detailed Description of Your Bonuses:

1. 👨‍💻 Best free resources to learn Python and Pandas, if you're not already familiar with them
2. ⚙️ Local Environment Setup: how to install Python, VS Code, and Anaconda (if you haven't already)
3. 📚 Mini-Courses & extra lessons covering the following:
         - Web Scraping Fundamentals
         - Selenium Fundamentals
         - Introduction to Google Cloud Platform
         - How to Set Up Your Google Cloud Platform Free Trial Account
         - Security and Access Permissions on Google Cloud Platform
         - BigQuery and Data Warehousing Fundamentals
         - Docker Fundamentals
         - Cloud Run Fundamentals
         - Introduction to Git & GitHub
4. 📂 Instructions and a well-structured README file to publish your project on GitHub
5. 🌐 Networking guidelines to share your project on LinkedIn and attract recruiters
6. 👔 Interview preparation with common interview questions on your project
7. 📝 Final assessment to test your acquired knowledge
8. 🏅 Certificate of completion to share with your network
9. 🌍 Access to our Slack Community 🌍 where we communicate daily to help each other with projects and job searching
😊 Don't miss this offer!


What Our Students Say ❤️

WHAT YOU ARE GOING TO BUILD

Project architecture diagram:

Project Workflow 

Here is the outline of the process you will follow to complete the project:
1. Getting Started:
Define the project, establish requirements, and devise an implementation strategy. Learn the essential steps for setting up and starting the project.

2. Understanding Web Scraping:
Delve into the fundamentals of web scraping, exploring best practices and common pitfalls using a case study of fake Python job postings.

3. Mini-Course on Selenium:
Master the basics of Selenium for web scraping, setting the stage for practical applications.

4. Web Scraping Implementation:
Apply your Selenium skills to scrape Python job listings, demonstrating the capability to collect and manage web data effectively.
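To make the extraction step concrete, here is the parsing core of such a scraper, sketched with the standard library's `html.parser` so it runs without a browser. The class names (`card-content`, `title`, `company`, `location`) are assumptions modeled on the fake-jobs practice page; in the project itself the same fields would be located through Selenium calls such as `driver.find_elements(By.CLASS_NAME, "card-content")`.

```python
from html.parser import HTMLParser

# A single job card, in the markup style of the fake-jobs practice page.
SAMPLE = """
<div class="card-content">
  <h2 class="title">Senior Python Developer</h2>
  <h3 class="company">Payne, Roberts and Davis</h3>
  <p class="location">Stewartbury, AA</p>
</div>
"""

class JobCardParser(HTMLParser):
    """Collects the text of elements whose class matches a wanted field."""
    FIELDS = {"title", "company", "location"}

    def __init__(self):
        super().__init__()
        self.jobs = []          # one dict per completed job card
        self._current = {}      # fields of the card being parsed
        self._capture = None    # field name we are currently inside, if any

    def handle_starttag(self, tag, attrs):
        classes = dict(attrs).get("class", "").split()
        if "card-content" in classes:   # a new job card begins
            self._current = {}
        for field in self.FIELDS:
            if field in classes:
                self._capture = field

    def handle_data(self, data):
        if self._capture and data.strip():
            self._current[self._capture] = data.strip()
            self._capture = None
            if len(self._current) == len(self.FIELDS):
                self.jobs.append(self._current)

def parse_jobs(html: str):
    parser = JobCardParser()
    parser.feed(html)
    return parser.jobs
```

Calling `parse_jobs(SAMPLE)` yields one record with the three fields; a Selenium-driven version would feed `driver.page_source` (or per-element text) into the same extraction logic.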

5. Google Cloud Platform Introduction:
Introduce Google Cloud Platform, focusing on setting up a free trial account and understanding its core functionalities.

6. Data Storage on BigQuery:
Learn to store scraped data in BigQuery, exploring how to organize it for efficient analysis and accessibility.
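As a sketch of the load step, scraped records can be normalized into flat rows before handing them to the BigQuery client. The table and project names below are placeholders, and the actual upload call is left as a comment because it requires GCP credentials.

```python
import json

def to_bq_rows(jobs, scraped_at):
    """Normalize scraped job dicts into the flat rows a BigQuery table expects."""
    rows = []
    for job in jobs:
        rows.append({
            "title": job.get("title"),
            "company": job.get("company"),
            "location": job.get("location"),
            "scraped_at": scraped_at,  # date column for daily runs, ISO 8601 string
        })
    return rows

jobs = [{"title": "Senior Python Developer",
         "company": "Payne, Roberts and Davis",
         "location": "Stewartbury, AA"}]
rows = to_bq_rows(jobs, "2024-05-01")

# Uploading the rows (needs credentials; table path is illustrative):
# from google.cloud import bigquery
# client = bigquery.Client()
# client.load_table_from_json(rows, "my_project.jobs_dataset.python_jobs").result()

print(json.dumps(rows[0]))
```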

7. Mini-Course on Docker:
Understand how to containerize Python scripts using Docker, essential for scalable deployments.
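Step 7 ends with a containerized scraper. A Dockerfile along these lines is typical; the package names, file layout, and Python version below are assumptions for illustration, not the course's exact setup.

```dockerfile
# Illustrative sketch — adjust versions and file names to your project.
FROM python:3.11-slim

# Selenium needs a browser and a matching driver inside the container;
# the exact packages depend on the base image you choose.
RUN apt-get update && apt-get install -y --no-install-recommends \
        chromium chromium-driver \
    && rm -rf /var/lib/apt/lists/*

WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .

# Cloud Run routes traffic to the port in $PORT; main.py starts an HTTP server.
CMD ["python", "main.py"]
```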

8. Cloud Run Deployment:
Deploy the scraping script on Google Cloud Run, configuring it for reliable, scalable operations.

9. Automation with Cloud Scheduler: 
Set up Cloud Scheduler to automate the scraping process, ensuring daily data updates without manual intervention.
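The scheduling itself is a one-time piece of configuration. An invocation along these lines is typical; the job name, URI, schedule, and region are placeholders, so check `gcloud scheduler jobs create http --help` for the current flags before running anything.

```shell
# Trigger the Cloud Run service every day at 08:00 (unix-cron syntax).
gcloud scheduler jobs create http scrape-python-jobs \
    --schedule="0 8 * * *" \
    --uri="https://YOUR-CLOUD-RUN-SERVICE-URL/" \
    --http-method=POST \
    --location=europe-west1
```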

10. Project Wrap-Up:
Review final steps and instructions for winding down the Google Cloud Platform Free Trial Account properly.

11. Portfolio Enhancement & Interview Prep:
Learn how to upload your project to GitHub to showcase your technical skills to potential employers, and prepare for common interview questions related to your project.

Take your career to the next level
with better job opportunities and skills!

😍 Limited time offer!

Project Material

We provide different types of content to serve learning purposes in the most efficient manner.

πŸ“ Educational Text Material

We provide comprehensive step-by-step tutorials designed to guide you through building and deploying your project on the cloud. You'll receive detailed instructions for uploading your project to GitHub and gain access to common interview questions and answers related to the project. These resources are tailored to enhance your understanding and boost your job preparation effectively.

▶️ Comprehensive Video Lectures

For a more immersive learning experience, we offer video tutorials whenever needed. For instance, you can learn to build an interactive dashboard by closely following our step-by-step video lessons, making complex concepts easier to grasp and apply.

💯 Assessments & Quizzes

Validate your grasp of the technologies and their strategic application with carefully designed assessments and quizzes integrated within the course structure.

🎓 Certificate of Completion

Upon successful completion of the course, you will be awarded a professional Certificate of Completion. Showcase your accomplishment on platforms like LinkedIn and other social networks.


YOUR INSTRUCTOR

Mike Chionidis

Freelance Data & AI Engineer, specializing in leading teams and building products
About me

👋 I'm Mike, a Freelance Data & AI Engineer based in Greece.


Throughout my career, I've worked with prominent clients like Publicis Groupe and delivered data products used by industry giants such as Samsung, Three Mobile, and Western Union. Within just two years of professional experience, I advanced to a Team Lead role, guiding a team of data professionals.

Previously, I taught programming to university students to help them excel in their exams, and I helped junior developers kickstart their careers.

My passion for sharing knowledge led to the establishment of DataProjects, with a clear purpose to help data enthusiasts secure their dream roles in the field. 

Let's embark on this learning adventure together, as we delve into the exciting world of data! 💫

Frequently asked questions

1. How much Python and SQL knowledge is required?

A fundamental proficiency in Python, specifically with the Pandas library, is necessary: you should have experience working with Pandas DataFrames. In SQL, a solid understanding of fundamental statements such as SELECT, FROM, WHERE, and JOIN, as well as familiarity with SQL tables, is recommended.
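To calibrate expectations, the SQL required is roughly this level, sketched here against a hypothetical jobs/companies schema using the standard library's sqlite3; BigQuery's SQL dialect differs in details, but the SELECT/FROM/WHERE/JOIN shape is the same.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE jobs (id INTEGER PRIMARY KEY, title TEXT, company_id INTEGER);
    CREATE TABLE companies (id INTEGER PRIMARY KEY, name TEXT, city TEXT);
    INSERT INTO companies VALUES (1, 'Acme Data', 'Athens'), (2, 'Initech', 'Berlin');
    INSERT INTO jobs VALUES (1, 'Python Developer', 1), (2, 'Data Engineer', 2),
                            (3, 'Python Data Engineer', 2);
""")

# Filter one table and join it to another — the level of SQL the project assumes.
rows = con.execute("""
    SELECT j.title, c.name, c.city
    FROM jobs AS j
    JOIN companies AS c ON c.id = j.company_id
    WHERE j.title LIKE '%Python%'
    ORDER BY j.id
""").fetchall()

for title, company, city in rows:
    print(f"{title} @ {company} ({city})")
```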

2. Will I need to subscribe to any Cloud services?

No, you won't need to pay for any Cloud subscriptions. Google Cloud offers all new users a $300 credit to use within their first 90 days. Simply create an account with your credit card, activate the trial, and follow the provided instructions to deactivate automated renewal, as demonstrated in the project videos.

3. How long will I have access to the project?

You will have access to the project and its materials for the duration of your subscription. This includes all future updates related to the project during your subscription period.

4. Do you offer training for professionals and businesses?

Yes, we offer professional training on data and cloud technologies on request. For inquiries, please reach out to us at info@dataprojects.io with your specific details, or use our Contact Form.

5. Have more questions?

For any additional inquiries or clarifications, feel free to contact us at info@dataprojects.io or use our Contact Form.

Join our newsletter!

Get updates on new projects, our weekly blog with valuable content on data and cloud topics, big sales and more!