Jeremias Jaymez

About

I'm a Software Engineer from Córdoba, Argentina, currently living in The Netherlands.

Passionate about data, working as a Senior Data Engineer at Milence.

Master's degree in Software Engineering (2023), currently pursuing a Master's in Artificial Intelligence.

Data Engineer, DevOps mindset.

Proud founder of Arsysco, a personal brand under which I have developed web applications for local clients since 2015.

I recently rebranded my site as JJ Data Trending to start creating content on data-related topics.

  • Working at: Milence
  • Living in: The Netherlands
  • Position: Senior Data Engineer
  • MS Learn: jeremiasjaymez

Resume

Find detailed information about my education, certifications, and work experience on LinkedIn.

Visit LinkedIn

Data Engineering

Started a GitHub repository to contribute and share knowledge with the community.

Visit jeremiasjaymez/DataEngineering

Background

I've worked in different roles related to data and development, which gave me a more complete set of tools to help my team grow.

Data

Currently working as a Senior Data Engineer, mainly experienced in Azure and Databricks.
For my Master's thesis and some personal projects, I applied Machine Learning.
I started my career as a BI Developer/Analyst, handling tasks such as data warehouse modelling and building ETL processes and dashboards.

DevOps Engineer

In my last two companies, I also took on DevOps and DataOps tasks:
mainly deploying Infrastructure as Code with different tools, making pipelines more reliable, and improving CI/CD processes.
I'm also experienced in Databricks administration: securing components, deploying Workflows, and configuring policies, among other tasks.

Software Engineer

Worked as a Software Engineer on freelance projects for mid-sized companies.
Full-stack experience with .NET Core and C#.

Skills

This section summarizes the skills I gained across the three main positions I've held in my career.

  • All
  • Data
  • DevOps
  • Developer

ETL

Databricks, Synapse, ADF, MicroStrategy, Pentaho.

Web Apps

MVC, .NetCore, C#, Bootstrap.

Functions

Azure Serverless Functions.

Modelling

MicroStrategy, Diagrams.

Visualization

PowerBI, MicroStrategy.

IaC

Azure Pipelines with Bicep, Bash, PowerShell, Terraform.

SQL

Advanced SQL.

Python

Python, PySpark.

CI/CD

Azure build and release pipelines.

Challenges

Secure Platform

Working for a financial institution, I encountered the challenge of developing a highly secure platform for Data Scientists and Analysts to perform various investigations.
The main objective was to create a workstation capable of providing robust computational power and software programs to efficiently process vast datasets while ensuring security and compliance.
To address this challenge, our team developed a secure workstation that offered the necessary compute power, optimized performance, and regulatory compliance.
This involved comprehensive work on networking, storage, compute resources, data ingestion, availability, and software maintenance.
Ensuring regulatory compliance was essential to safeguarding personal information, and optimizing performance was critical given the large volumes of data being requested.
In addition to my technical contributions towards maintenance and the development of new features, I played an important role in project organization.
This included participating in discussions, defining next steps, and aligning efforts with architects, users, and the Product Owner.

Accurate Data Delivery

I worked for a consultancy company as a contractor for a US non-profit organization focused on helping first-time moms living in poverty during the first 100 days of their babies' lives.
The challenge was to enable the organization to create reports to measure the impact on society made by the nurses and the supporting staff, with the funds provided by philanthropic foundations and the US Government.
As a team, we were responsible for delivering accurate data: we defined business rules that drove the transformation process and flagged inconsistencies, which were reported through dashboards.
To simplify business-rule validation, we implemented a user-friendly web application that encapsulated the complexity and displayed data changes across the three applied layers (bronze, silver, gold).
HIPAA compliance was required to ensure the privacy and security of Protected Health Information (PHI), which includes any information that can be connected to an individual's health condition.
Given that the data was manually entered into the core applications, special effort was needed in the transformation process.
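The cleaning step described above can be sketched in plain Python. This is a minimal illustration only: the record fields and the two rules shown are hypothetical examples, not the organization's actual business rules. The idea is that raw (bronze) records either pass all rules and move to the cleaned (silver) layer, or get flagged as inconsistencies for the dashboards.

```python
def apply_business_rules(bronze_records):
    """Apply illustrative business rules; return (silver records, inconsistencies).

    Field names and rules below are hypothetical, for illustration only.
    """
    silver, inconsistencies = [], []
    for record in bronze_records:
        issues = []
        # Example rule 1: manually entered visit dates must not be empty.
        if not record.get("visit_date"):
            issues.append("missing visit_date")
        # Example rule 2: ages must fall in a plausible range.
        age = record.get("mother_age")
        if age is not None and not (12 <= age <= 60):
            issues.append(f"implausible mother_age: {age}")
        if issues:
            inconsistencies.append({"id": record["id"], "issues": issues})
        else:
            silver.append(record)
    return silver, inconsistencies

bronze = [
    {"id": 1, "visit_date": "2020-03-01", "mother_age": 24},
    {"id": 2, "visit_date": "", "mother_age": 19},
    {"id": 3, "visit_date": "2020-03-05", "mother_age": 7},
]
silver, flagged = apply_business_rules(bronze)
```

In practice this kind of logic would run as PySpark transformations between layers, with the `flagged` output feeding the inconsistency dashboards.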

Last Blog Entries

Latest thoughts and insights about technology, data engineering, and AI.

Data Engineering Best Practices

First content, available soon! Stay tuned!

Python Logging

Coming soon!

Contact

Let's connect! You can find me on these platforms:

I'm always interested in new opportunities and collaborations. Feel free to reach out through any of these channels!