Discover our Tech Team
Get to know what our tech team has to offer
At travel audience we have one of the most dynamic and engaged tech teams; our engineers, analysts, and scientists tackle complex challenges while taking responsibility and ownership from day one. Our tech leadership brings an average of over 10 years of experience, with notable achievements across industries in companies ranging from early startups to established enterprises.
Our tech and data teams are at the core of our company, driving innovation and managing our proprietary technology and databases. The engineering teams work closely together, leaning on one another daily and taking part in regular knowledge-sharing activities. They work with one of the most modern and diverse tech stacks in the industry, guided by that senior tech leadership.
Our current challenges include scaling up our technology to meet the needs of our clients and those of the ever-evolving travel industry. We believe in acceleration through agility; we empower our tech team to follow agile principles and embrace a customer-oriented and responsive working style.
Big data & Real time
Our data team deals with tens of thousands of events per second and terabytes of data per day. There are always new ways to leverage our data and new challenges to solve, which requires an agile approach to architecture and a willingness to embrace new technologies, whether open-source or cloud-based services.
Our backend crew handling high-throughput, low-latency, highly resilient software for real-time bidding and ad delivery
Our frontend and backend engineers developing the advanced graphical tooling for administering ad campaigns
Our SRE team operating our cloud environment, applications, and GitOps CI/CD pipelines using modern, state-of-the-art technologies
IT Service desk
Our team managing the IT systems, tools, and processes that enable our employees to perform their jobs securely and successfully
Our team developing the data platform for ingesting, cleaning, aggregating, and storing terabytes of data per day
Our dedicated data scientist crew developing state-of-the-art machine learning solutions to optimize the audiences targeted by ad campaigns, as well as budget and resource allocation
Data Analytics and Reporting
Our team dedicated to extracting insights from our BI platform and enabling decision-making for all our customers and internal stakeholders
Tech blog
Putting your advertising budgets into optimal use
Optimizing high-frequency ad auctions is certainly one of the most fascinating aspects of ad tech. There is plenty to optimize before one even gets to the auctions, though. To participate in one, you obviously need budget. Typically, the total budget is fixed but split across many campaigns that target distinct inventory, such as distinct geographical markets, user profiles, or websites, which makes the campaigns vary in price and performance. The performance of any particular campaign is irrelevant, though; only their aggregated performance matters. Thus the question is: how should the total budget be allocated across campaigns so that aggregate performance is maximized?
Essentially, this is a problem of budget allocation, also known as media planning, and it is closely related to marketing mix modeling. Traditionally, solving it has been the responsibility of campaign managers, who monitor performance arbitrages and shift budgets around accordingly. This can be a tedious and challenging manual task, though.
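The manual shifting described above can be sketched in a few lines. This is a minimal illustration, not our production optimizer; the campaign names, the performance metric, and the fixed reallocation step are all hypothetical:

```python
def reallocate(budgets, performance, step=0.1):
    """Shift a fraction of budget from the worst- to the best-performing
    campaign, keeping the total budget fixed (a toy version of what a
    campaign manager does by hand)."""
    best = max(performance, key=performance.get)
    worst = min(performance, key=performance.get)
    if best == worst:
        return dict(budgets)
    shift = budgets[worst] * step  # move 10% of the laggard's budget
    new = dict(budgets)
    new[worst] -= shift
    new[best] += shift
    return new

# Hypothetical campaigns with equal budgets but unequal performance
budgets = {"campaign_de": 100.0, "campaign_fr": 100.0}
perf = {"campaign_de": 0.8, "campaign_fr": 1.2}  # e.g. conversions per euro
new_budgets = reallocate(budgets, perf)
# The total stays fixed; budget drifts toward the better-performing campaign.
```

Repeating such a step as fresh performance data arrives is the arbitrage-chasing loop the post refers to; an automated allocator simply does it continuously and at scale.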
This post is intended to share our experience of building an externally facing Business Intelligence solution in Tableau. Although it is specific to our use case, we believe our approach and practices might be helpful for those who work on similar projects, or for anyone developing complex Tableau-based analytics products in teams.
Working with destination marketing organizations (DMOs), who usually need to adapt very quickly to changing circumstances, we focused on taking our reporting and analytics to the next level. The current situation with international travel restrictions has made it more important than ever to keep our clients constantly up to date. To address this, we launched a project to build a standardized BI solution for DMOs called Destination Suite.
Usually, launching this kind of project would mean leveraging embedded analytics as part of a web application. In our case, however, we decided to take another approach. We needed to produce feasible and shareable results very quickly, without involving frontend and backend engineers from other teams until the solution was finalized. To make that happen, we joined forces with our product team and UI/UX designer and aimed at developing a set of feature-rich dashboards connected by a single navigation. Having no dependencies on other technical teams also made it easier to follow iterative development cycles and start receiving stakeholder feedback as early as possible.
This article presents our approach at Travel Audience to deploying applications to Kubernetes clusters that run around the clock, all year.
Unlike blog posts promising silver-bullet solutions, the setup presented here is an opinionated approach with its own benefits and drawbacks. We are tackling the problem of managing complex software deployments; the manifests and configuration files below can serve as inspiration for your own, tailored setup, enabling quick, continuous releases using Flux.
GitOps is one of many frameworks in the software world. Beyond being the latest hot jargon, its primary assumption is a single source of truth: Git is used to manage the whole application lifecycle.
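As a minimal sketch of what that single source of truth can look like with Flux v2, the pair of resources below tells Flux which Git repository to watch and which manifests to apply. The repository URL, names, and paths are hypothetical placeholders, not our actual setup:

```yaml
# GitRepository: Flux polls this repo as the single source of truth.
apiVersion: source.toolkit.fluxcd.io/v1
kind: GitRepository
metadata:
  name: app-manifests           # hypothetical name
  namespace: flux-system
spec:
  interval: 1m                  # how often to check for new commits
  url: https://example.com/org/app-manifests.git  # placeholder URL
  ref:
    branch: main
---
# Kustomization: Flux applies the manifests under ./deploy to the cluster.
apiVersion: kustomize.toolkit.fluxcd.io/v1
kind: Kustomization
metadata:
  name: app
  namespace: flux-system
spec:
  interval: 10m
  sourceRef:
    kind: GitRepository
    name: app-manifests
  path: ./deploy
  prune: true                   # delete cluster objects removed from Git
```

With `prune: true`, deleting a manifest from the repository also removes the corresponding object from the cluster, so the repository remains the only place where the application state is defined.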
At travel audience, we provide integrated data-driven solutions for travel advertising. Since we deal with terabytes of data per day, selecting the right tools for data workloads is essential for us. Being on the Google Cloud Platform (GCP), we heavily rely on its most prominent technology — BigQuery. Though we use various other GCP components in our data teams, if I had to pick one component to recommend, it would be BigQuery, without a doubt. BigQuery is fast, powerful, and in most cases cost-effective.
This article from Google provides a good overview of BigQuery for a data warehouse practitioner. However, it does not cover organizing data in much detail. This post focuses solely on that part — how to organize data in BigQuery for effective and compliant management across multiple teams in your organization. Every organization is different, and Conway's law definitely applies to data modeling. Still, we think this could be a starting point for anyone to extend.
When we started with BigQuery, we focused mostly on "how to get work done" and not much on effective data management. But pretty soon, we ran into the following problems.
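One simple lever for organizing data across teams is a consistent dataset naming scheme, so that ownership and lifecycle stage are visible in the dataset id itself. The convention below is a hypothetical illustration of the idea, not the scheme the post goes on to describe:

```python
# Hypothetical convention: one BigQuery dataset per team and lifecycle
# stage, e.g. "data_platform_raw" or "analytics_curated". Access control
# can then follow the stage (e.g. analysts read only curated datasets).
STAGES = ("raw", "staging", "curated")

def dataset_id(team: str, stage: str) -> str:
    """Build a dataset id like 'data_platform_raw' (hypothetical scheme)."""
    if stage not in STAGES:
        raise ValueError(f"unknown stage: {stage}")
    return f"{team}_{stage}"

print(dataset_id("data_platform", "curated"))  # data_platform_curated
```

Encoding team and stage in the id keeps multi-team ownership legible even before labels or IAM policies are consulted.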
Meet the team
Our tech stack
Our hiring process
From sending in your application to receiving an offer, here is our hiring process simply explained, so you know what to expect.
We encourage our tech teams to take risks and develop without boundaries because we want our people to push for innovation. Looking for your next challenge? Don’t look any further and apply today!
Can’t find what you’re looking for? You can send your spontaneous application or just get in touch with us at firstname.lastname@example.org