Senior Data Engineer - Remote

16 days ago
Full-time role


kWh Analytics is insuring the Energy Transition by leveraging the most comprehensive performance database of solar assets in the United ...


Job Description

About us
kWh Analytics is a leading provider of Climate Insurance for zero-carbon assets. Drawing on its proprietary database of over 300,000 operating renewable energy assets, kWh Analytics uses real-world project performance data and decades of expertise to underwrite unique risk-transfer products on behalf of insurance partners. kWh Analytics was recently recognized on FinTech Global’s ESGFinTech100 list for its data and climate-insurance innovations.
The Solar Revenue Put production insurance protects against downside risk and unlocks preferred financing terms, and the Renewable Energy Property Product offers comprehensive coverage against physical loss. These offerings, which have insured over $4 billion of assets to date, aim to further kWh Analytics’ mission to provide best-in-class Insurance for our Climate.
Who we're looking for:
Everything kWh Analytics does depends on the quality and quantity of solar data in our industry-leading database. You’ll be in charge of parsing operational and financial data from remote APIs and client data file dumps, in a variety of formats, and integrating it into the database.

About you:

  • You have a Bachelor’s degree in Computer Science or equivalent real-world experience.
  • You have at least 5 years of professional experience using Python and SQL.
  • You have the curiosity and drive to constantly learn new languages, skills, and tools.
  • You thrive in a fast-paced, dynamic environment as kWh Analytics adapts to a rapidly growing and changing solar industry.
  • If you aren’t already familiar with the solar industry, you should be excited to learn about it.
  • You’re dedicated to quality, craftsmanship, truth, and accuracy in reporting.

What you can look forward to:

  • Writing code to parse and clean data from input files and remote APIs.
  • Optimizing database queries for good performance on very large data sets.
  • Helping design database schemas, or redesigning them to meet changing needs.
  • Working closely with teammates to improve and optimize the data pipeline.
  • Working across teams to understand their data needs and building new products to meet them.
  • Actively participating in code reviews.
  • Learning from your teammates and sharing your knowledge and ideas.
  • An equity stake in the company, via incentive stock options.
  • A wide variety of medical, dental, and vision plans. 401(k), HSA, FSA and corporate discounts.
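
Purely as an illustration of the parse-and-clean work described above (not kWh Analytics code), here is a toy Python sketch that reads a messy CSV of solar production readings, drops rows with missing values, and normalizes the fields. The field names and units are hypothetical:

```python
import csv
import io

# Hypothetical sample of a messy client data dump (not a real schema).
RAW = """site_id,date,energy_kwh
SOL-001,2023-01-01, 1250.5
SOL-001,2023-01-02,
SOL-002,2023-01-01,980
"""

def parse_readings(text):
    """Parse CSV text into clean records, skipping rows with missing readings."""
    records = []
    for row in csv.DictReader(io.StringIO(text)):
        energy = row["energy_kwh"].strip()
        if not energy:  # drop rows where the reading is missing
            continue
        records.append({
            "site_id": row["site_id"].strip(),
            "date": row["date"].strip(),
            "energy_kwh": float(energy),  # normalize to a numeric type
        })
    return records

print(parse_readings(RAW))
```

In practice this kind of cleaning would feed the cleaned records into the database via an ORM and run inside an orchestrated pipeline, but the core task (tolerating and filtering malformed input) looks much like this.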

Required skills:

  • Excellent Python skills
  • Fluency with SQL (we use Postgres, but experience with other flavors is acceptable)
  • Experience with Amazon Web Services (e.g. S3, Lambda, API Gateway, RDS, DynamoDB)
  • Fluency with Linux server command line environment
  • Experience with Docker
  • Experience with CI/CD tools
  • Experience designing data pipelines

Nice to have:

  • Experience working with photovoltaics (PV), wind, or battery data
  • Experience using Pandas
  • Experience using ORMs and schema migration scripts (we use SQLAlchemy/Alembic)
  • Experience parsing and cleaning gnarly data sets, and converting data to/from CSV, JSON, YAML, etc.
  • Experience with Data Orchestration Tools (e.g. Airflow, DBT)
  • Experience with Kubernetes
  • Experience building Data Lakes and related technology/platforms
  • Domain knowledge of energy, project finance, or insurance
  • Experience leading junior team members in large, multi-phase projects

Learn more about what we do and why it matters:
kWh Analytics is an equal opportunity employer. We celebrate diversity and are committed to maintaining an inclusive environment for all employees.
