Working at Thinking Machines
Thinking Machines is a technology consultancy building AI & data platforms to solve high-impact problems for our clients. Our vision is for Southeast Asia to become a global hub for data science. To do that, we create data cultures one organization at a time.
We’re a company made up of intellectually curious, civic-minded, forever-learning individuals. We believe that great data science products are built with care for people, and that the best way to drive inclusive innovation is to start with a diverse team.
Our field of work is incredibly dynamic, so we want to work with people who are committed to growing with us. We want to hire people who can demonstrate an ability to learn, then provide them with personalized coaching, growth opportunities, and a great working environment to help them become world-class.
Analytics Engineers act as the bridge between technology and business. They leverage both technical and business skills to develop scalable data systems that empower end-users to effectively answer their own questions.
As an Analytics Engineer, you will be able to:
- Combine disparate data sources into a form that is actually usable. You will use your engineering skills and understanding of the business use case to transform large amounts of data into useful metrics and tools for analysis.
- Clean, restructure, and transform raw data to be consumed by BI tools, dashboards, and other applications. You will work closely with data engineers, business intelligence analysts, software engineers, and more to ensure that analytics integrates seamlessly across large-scale data systems.
- Translate business requirements into data processes and automation. You will collaborate with business users to develop sustainable data models, perform data quality assessments, and design data transformations that enable analytics within their teams.
- Write scalable, efficient, and production-quality data transformation pipelines. You will apply engineering best practices to ensure the quality and maintainability of the data automation processes you develop.
We're looking for someone who meets the following profile:
- Effective communicator - You are able to collaborate with both technical engineers as well as non-technical business users. You know how to adapt your communication style to effectively communicate with the different audiences you will encounter. This extends to proper documentation of data dictionaries, data requirements, manuals, and guidelines.
- Comfortable with coding - You must be confident writing and explaining code, ranging from SQL for transformation queries to Python for process automation.
- Learning agility - We do not expect you to know everything, but we need you to quickly assimilate knowledge to deliver as you navigate unfamiliar tasks. You can demonstrate this by showing us your capacity to learn independently and problem-solve.
- Systems thinker - You are able to design data models and analytics systems holistically and identify opportunities to make analytics processes faster and more reliable. Your work will often sit in the middle of a larger system, so you need to be aware of how all the different parts affect each other.
- Quantitative critical thinker who leads with curiosity - You ask a lot of the right questions. How would you quantify data quality? Are there inconsistencies in the data? What use cases can the data serve? Dig into the raw data to validate it. Fascinated by the automated transformation pipeline? Pick it apart and learn how it works!
We expect the following technical skills:
- SQL for data analysis and wrangling
- Proficient in at least one programming language
- Familiar with data warehousing tools (e.g. BigQuery, Snowflake)
- Familiar with cloud platforms (e.g. GCP, AWS, Azure)
- Bonus points for candidates with experience in the following:
- Similar data roles (e.g. data engineer, data analyst)
- Python, especially for the purpose of data analysis
- Data Transformation tools (e.g. dbt, Dataform)
- Data Visualization tools (e.g. Google Data Studio, BigQuery, Tableau), especially for ad-hoc analysis
- Data Pipeline tools (e.g. Airflow, Dagster)
Benefits and Perks
We offer the following compensation and benefits:
- Competitive salary — compensation scales with the difficulty of the role, relevant experience, fit, and skill.
- Fully remote — due to the global pandemic, we have shifted to fully remote operations for the foreseeable future while we monitor the situation.
- Individual professional development budget — an annual budget for conferences, training courses, books, and software is available to sharpen your skills and build new ones to help you grow in your role.
- Full health benefits — generous health insurance package upon hiring.