Data Analyst Intern at Bauhaus Group

Opportunity Summary 

As a Data Analyst Intern, your key responsibilities will include:

• Assist in collecting, analyzing, and interpreting data from various sources.

• Support the development of reports and dashboards to track key performance indicators (KPIs) and business metrics.

• Conduct data cleansing and validation to ensure data accuracy and consistency (a brief sketch of this kind of check follows this list).

• Assist in identifying trends, patterns, and insights from data to inform business strategies.

• Help prepare presentations and visualizations to communicate findings to stakeholders.

• Participate in team meetings and contribute ideas to enhance data analysis processes.

• Perform ad-hoc analysis and support special projects as needed.

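As a concrete illustration of the cleansing-and-validation responsibility above, here is a minimal Python sketch. It assumes a pandas workflow; the file name and column names (orders.csv, region, order_date, amount, customer_id) are hypothetical placeholders, not part of this posting.

```python
# Minimal cleansing-and-validation sketch. pandas is assumed; the file
# name and every column name here are hypothetical placeholders.
import pandas as pd

df = pd.read_csv("orders.csv")  # hypothetical source extract

# Cleansing: normalize text, parse dates, drop exact duplicates.
df["region"] = df["region"].str.strip().str.title()
df["order_date"] = pd.to_datetime(df["order_date"], errors="coerce")
df = df.drop_duplicates()

# Validation: flag rows that break simple consistency rules.
invalid = df[
    df["order_date"].isna()      # unparseable or missing dates
    | (df["amount"] < 0)         # negative order amounts
    | df["customer_id"].isna()   # missing join keys
]
print(f"{len(invalid)} of {len(df)} rows failed validation")

# Keep only the rows that pass, for downstream reporting.
clean = df.drop(index=invalid.index)
```

In practice the rules would come from the team's data quality standards; the pattern of cleanse, flag, and filter stays the same.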

Training:

• Initial General Employer Orientation (Mandatory): familiarization with company culture, policies, and tools (JIRA, Tableau, GCP, data warehousing solutions). Duration: 1 day.

• Weekly One-on-One Mentor Meetings: personalized guidance, feedback, and goal setting, with a focus on ETL processes, data modeling, and pipeline optimization. Duration: 3 hours per week.

• Job Shadowing: practical exposure to designing and implementing ETL pipelines, including data extraction, transformation, and storage optimization. Duration: 2 weeks (2 hours daily).

• Workshops/Skills Training: technical skills development in R, SQL, Tableau, ETL design, and cloud platforms (GCP). Duration: 4 weeks (weekly 2-hour sessions).

• Provision of Work Samples: access to previous projects and tasks for reference, including Tableau dashboards and Python scripts. Duration: ongoing (as needed).

• Overview/Contextualization of Assigned Tasks: detailed briefing on each task, including objectives, expected outcomes, and relevance to data engineering projects. Duration: before the start of each new task.

Learning Outcomes:

• Proficiency in Data Pipeline Development: design, develop, and maintain efficient ETL (Extract, Transform, Load) pipelines that handle large volumes of structured and unstructured data, and integrate data from sources such as APIs, relational databases, and flat files into a cohesive, accessible format (a minimal ETL sketch follows this list).

• Enhanced Data Analysis and Visualization Skills: gain hands-on experience in data cleaning, transformation, and aggregation using R, SQL, and other data manipulation tools to prepare high-quality datasets, and develop interactive dashboards and reports in Tableau and other BI (Business Intelligence) tools that support data-driven decision-making.

• Agile Project Management Experience: participate in Agile Scrum processes, including sprint planning, daily stand-ups, sprint reviews, and retrospectives, and use JIRA for task management, progress tracking, and sprint reporting to ensure timely completion of deliverables.

• Improved Collaboration and Communication Skills: collaborate with cross-functional teams of data scientists, data engineers, software developers, and business stakeholders to gather and process data requirements, and write clear documentation for data workflows, ETL processes, and data governance protocols.

• Knowledge of Data Quality and Governance: help implement data quality and governance standards, ensuring the accuracy, consistency, and reliability of data through profiling, validation, and monitoring, and develop a strong understanding of data warehousing concepts such as OLAP (Online Analytical Processing), OLTP (Online Transaction Processing), and data lakes.

• Technical Skill Advancement: strengthen proficiency in R for data manipulation, statistical analysis, and visualization; build foundational SQL skills for database querying and management; and gain exposure to cloud platforms such as AWS (Amazon Web Services) or Google Cloud Platform (GCP) and to Python for data engineering tasks.
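To ground the pipeline-development outcome above, here is a minimal end-to-end ETL sketch in Python. It is an illustration under stated assumptions, not a description of Bauhaus Group's actual stack: pandas and requests are assumed available, the API endpoint, file names, and column names are hypothetical, and SQLite stands in for a cloud warehouse such as BigQuery on GCP.

```python
# Minimal ETL sketch. The endpoint, file names, and column names are
# hypothetical placeholders; SQLite stands in for a cloud warehouse.
import sqlite3

import pandas as pd
import requests  # assumed available for the API extract step

# Extract: pull from a flat file and a (hypothetical) JSON API.
sales = pd.read_csv("sales.csv")
resp = requests.get("https://api.example.com/customers", timeout=30)
customers = pd.DataFrame(resp.json())

# Transform: join the two sources and aggregate a simple KPI.
merged = sales.merge(customers, on="customer_id", how="left")
kpi = (
    merged.groupby("region", as_index=False)["amount"]
    .sum()
    .rename(columns={"amount": "total_sales"})
)

# Load: write the result to a local warehouse table, then query it
# back with plain SQL, as a report or dashboard would.
with sqlite3.connect("warehouse.db") as conn:
    kpi.to_sql("regional_sales", conn, if_exists="replace", index=False)
    top = pd.read_sql_query(
        "SELECT region, total_sales FROM regional_sales "
        "ORDER BY total_sales DESC LIMIT 5",
        conn,
    )
print(top)
```

The same extract-transform-load shape carries over when the sources become production APIs and the destination becomes a managed warehouse; only the connectors change.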

Program: Academic Internship
Location Type: Hybrid (combination of on-site and remote)
Location: Carson, California, United States
Compensation Provided: No
Opportunity Availability: 08/07/2024 to 12/12/2024