Join us at BUILD - the Dev Conference for AI and Apps

Register for free to hear the latest product announcements and push the limits of what can be built on Snowflake.

Data Engineering Hands-On Workshop

Phoenix

Bring your laptop and join us on Nov 14 at 5:00 PM for Intro to Data Engineering with Snowflake Python!

Nov 15, 12:00 – 2:00 AM (UTC)

BUILD Data Engineering

About this event

Intro to Data Engineering with Snowflake Python

🚀 Join Us for a Data Engineering Hands-On Workshop! 🚀

🗓 Date: November 14th, 2024

🕔 Time: 5:00 PM

πŸ“ Location: TD SYNNEX

Agenda

5:00-5:30 PM: Networking, Drinks, and Snacks 🍹🍿

5:30-6:00 PM: Walkthrough of Data Engineering Features 🛠️

6:00-6:45 PM: Hands-On Lab 💻

Prior to the event, make sure you...

REGISTER FOR THE EVENT by 11/11 at the latest! Walk-ins / late registrants will not be granted access to the building.

YOU MUST BRING A FULLY CHARGED LAPTOP TO PARTICIPATE. We recommend bringing a personal device if your corporate laptop has restrictions.

Prerequisites

  • Familiarity with Python
  • Familiarity with the DataFrame API
  • Familiarity with Snowflake
  • Familiarity with Git repositories and GitHub

What You'll Need

You will need the following things before beginning:

  • Snowflake account
  • A Snowflake user created with ACCOUNTADMIN permissions. This user will be used to set things up in Snowflake.
    • Anaconda Terms & Conditions accepted. See Getting Started section in Third-Party Packages.
  • GitHub account
  • If you don't already have a GitHub account, you can create one for free. Visit the Join GitHub page to get started.

What You Will Learn:

  • Setup Environment: Use stages and tables to ingest and organize raw data from S3 into Snowflake
  • Snowflake Marketplace: Download the data you need from Snowflake Marketplace and use it for your analysis
  • Data Engineering: Leverage Snowpark for Python DataFrames to perform data transformations such as group by, aggregate, and join to prep the data for downstream applications
  • Orchestrating Pipelines: Use the Snowflake Python Tasks API to turn your data pipeline code into operational pipelines with integrated monitoring
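As a rough preview of the transformation step, here is a small sketch of group by, aggregate, and join using pandas as a stand-in (Snowpark for Python exposes a very similar DataFrame API, with methods like `group_by`, `agg`, and `join` running against tables in your Snowflake account; the table and column names below are made up for illustration):

```python
import pandas as pd

# Hypothetical raw order data, standing in for a table loaded into Snowflake.
orders = pd.DataFrame({
    "customer_id": [1, 1, 2, 3],
    "amount": [10.0, 20.0, 5.0, 7.5],
})

# Hypothetical customer dimension, e.g. a dataset obtained from
# Snowflake Marketplace.
customers = pd.DataFrame({
    "customer_id": [1, 2, 3],
    "region": ["West", "East", "West"],
})

# Group by + aggregate: total spend per customer.
totals = orders.groupby("customer_id", as_index=False).agg(
    total=("amount", "sum")
)

# Join the aggregates to the customer dimension for downstream use.
enriched = totals.merge(customers, on="customer_id")
print(enriched)
```

In the lab itself these operations run inside Snowflake via Snowpark DataFrames rather than in local memory, but the shape of the code is much the same.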

Location:

410 E Rivulon Blvd ste 201, Gilbert, AZ 85295

All guests may be required to show a government-issued photo ID for entry.

Get ready for an evening of learning, networking, and hands-on practice! See you there! 🎉

When

Friday, November 15, 2024
12:00 AM – 2:00 AM (UTC)

Organizers

  • Jyoti Pathak

    Clipeum.io

    Data Superhero & Phoenix Chapter Leader

  • Esi Nettey

    Chapter Leader
