Headquartered in New York, the greatest city in the world, The International Culinary Center (ICC) is one of the leading culinary schools in the US. It offers professional culinary, pastry and wine education to students from around the world, with only the best chefs teaching the techniques and skills necessary to succeed in the food and wine industry. AdVenture Media has been working with The ICC for over three years, and our relationship continues to evolve with shifts in the landscape.
The ICC first hired AdVenture Media more than three years ago to take responsibility for all digital advertising. The most important metric we had to optimize for was cost per lead (CPL). But for the ICC, cost per lead is a complicated KPI. The ICC offers 20+ different programs, courses, classes and enrollment options, and each specific option has a different value, profit margin and target cost per lead.
To make matters more complex, the funnel for each program was different.
For example, the Sommelier program may have the following funnel close rates (these are not actual numbers):
The numbers for the Culinary Arts program, meanwhile, may look entirely different at every stage of the funnel.
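To make the per-program CPL math concrete, here is a minimal sketch of how a target cost per lead falls out of a program's funnel close rate and economics. All numbers, and the target-return parameter, are hypothetical illustrations, not ICC figures:

```python
# Hypothetical illustration: a program's target CPL derived from its
# funnel close rate, value and margin. All numbers are made up.
def target_cpl(program_value, profit_margin, lead_to_enroll_rate, target_roas=3.0):
    """Max cost per lead that keeps ad spend at the desired return."""
    profit_per_enrollment = program_value * profit_margin
    # Each lead is worth the enrollment profit scaled by how often leads enroll.
    value_per_lead = profit_per_enrollment * lead_to_enroll_rate
    return value_per_lead / target_roas

# Two programs with different funnels imply different CPL targets.
sommelier = target_cpl(program_value=12_000, profit_margin=0.4, lead_to_enroll_rate=0.05)
culinary = target_cpl(program_value=35_000, profit_margin=0.3, lead_to_enroll_rate=0.02)
print(round(sommelier, 2), round(culinary, 2))
```

Note how a cheaper program with a better close rate can justify a higher CPL than a more expensive one, which is why a single blended CPL target would mislead here.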
It was apparent this account would not be simple or easy to optimize and grow. In the period leading up to The ICC hiring AdVenture, CPLs were rising and profitability was dropping.
The second major challenge was the sheer size and complexity of the ICC's website, which has over 150 pages of content. We had to decide where traffic from our campaigns should be sent: to standalone landing pages or to the primary program pages on the main site.
A massive campaign restructuring and overhaul got underway, but at the same time we brought in our UX and Conversion Rate Optimization team to get to work on optimizing for conversion rate. As we've written about previously, improving conversion rate is the most cost-effective way of decreasing cost per lead or cost per conversion.
The client already had standalone landing pages for specific classes and programs.
We felt that the existing landing pages were too thin on content and didn't give the visitor enough relevant information, but the primary program pages on the main website were too disorganized and distracting.
We presented a strategy to the client for building and designing new landing pages for each of the programs over a three-month period. We'd build the landing pages on a framework that allowed us to run A/B tests on copy and design, a crucial element of effective conversion rate optimization (if you're not testing, you're guessing).
Based on a cost benefit analysis, we decided (along with the client) we'd begin with redesigning landing pages for five of The ICC's programs:
Our goal was to focus on four main conversion strategies in our landing page redesigns:
This is a screenshot of the client's original landing page:
You can see that while the content is readable, the production quality of the page is low. We also predicted that serious prospects would want more on-page information and content than what was being offered here before submitting a sales form (which asked for a phone number and email).
Our first variant can be seen here:
As you can tell immediately, the page is much longer. There's no value in a short page for businesses offering products and services that represent a meaningful decision in people's lives. Our testing has shown customers and prospects want more information than is typically provided. The page includes more information about the specific program, enrollment process and the awards and accolades that make The ICC one of the preeminent culinary institutions in the world.
Instead of A/B testing a headline, we decided to "test big" and test an entirely different page variant. While this created more work on our part, it's better to test large structural variants than an individual headline at the start of an A/B testing schedule. Here is a look at our second variant:
To an untrained eye, the core differences between these two variants may be hard to spot. After all, both are long pages with plenty of copy and images, and both have a high level of production quality.
Here's where the pages differ fundamentally:
There were more changes, but those are the most fundamental variations likely to account for a variance in conversion rate.
Not every test goes your way. If you find every test going your way, you're probably not testing enough, or you're not testing significant variations.
We were sending 25% of traffic to variant A, 25% of traffic to variant B and 50% of traffic to the client's original landing page.
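The 25/25/50 split can be sketched as a simple weighted assignment. This is an illustration of the splitting logic only, not the actual ad-platform or testing-tool implementation we used:

```python
import random

# Traffic split described above: 25% variant A, 25% variant B, 50% original.
VARIANTS = [("variant_a", 0.25), ("variant_b", 0.25), ("original", 0.50)]

def assign_variant(rng=random):
    """Assign an incoming visitor to a landing page, weighted by split."""
    r = rng.random()
    cumulative = 0.0
    for name, weight in VARIANTS:
        cumulative += weight
        if r < cumulative:
            return name
    return VARIANTS[-1][0]  # guard against floating-point edge cases
```

In practice a real test would key assignment to something stable (a cookie or hashed user ID) so returning visitors see the same variant; `random.random()` here is just for illustration.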
We found conversion rate to be relatively flat, but across mobile specifically, conversion rate was actually worse on our newly designed pages (both of them).
Before scrapping the new pages entirely, we decided to run usability tests to make sure we didn't miss anything important or fundamental (which of course can happen from time to time).
We took screenshots of the original landing page, and both variants of the new landing pages.
We sent out the survey to 100 people in the US, ages 25-40 and asked them to identify their current work position from a list of choices. We also asked them to identify their gender as male or female.
Before seeing any image, they were asked:
"Imagine that you are interested in enrolling or learning more about culinary school in New York. You search Google for "culinary schools near me", click an ad and come to one of these two landing pages. Select the landing page that you like the best."
They were then shown an image of the original landing page and an image of one of the two variants we designed. They had the ability to zoom in and read the copy.
After they chose which design they preferred, we asked them two follow up questions:
We compiled all the raw data into a spreadsheet, allowing us to analyze the results.
Our first usability test indicated that we didn't miss anything fundamental. Users knew what the page was about, they thought it had a higher level of production quality ... but the conversion rate (especially on mobile) was worse.
We place a great deal of emphasis at our agency on understanding data. We want to know why something is happening; knowing that something is happening isn't enough. We didn't have empirical evidence to support any data-driven explanation, so we thought long and hard, put ourselves in a visitor's shoes, and tried to hypothesize why the newer, "nicer" landing pages weren't converting as well as the original landing page.
After much thought and discussion, we realized that the original landing page was so sparse and thin on content that it gave the immediate impression of being a standalone landing page: a webpage not navigable to from The ICC's main website. In fact, that was exactly the case.
These landing pages are hosted on subdomains and cannot be reached through the site's normal, global navigation. Our hypothesis was that visitors felt a degree of implied urgency when sent to a page that "couldn't be found again."
Almost like finding a treasure you don't know how to navigate back to. Maybe visitors were more inclined to convert today specifically because they felt that tomorrow they wouldn't be able to find this offer again.
The new landing pages we designed looked and felt like fully designed landing pages that could very well be part of the ICC's main website.
These landing pages were also standalone landing pages, but they felt like regular inner pages because they were more professionally designed and laid out. If visitors felt these were pages they could find again later by visiting the ICC's primary site, we'd lose that sense of urgency, which would explain the drop in conversion rate.
We decided to run a new usability test to determine if our theory was accurate.
All tests were subject to the following constraints:
The purpose of the test was to see if we were right in assuming that a typical user would perceive the old landing page to be a standalone landing page, not navigable to from the main site.
It was important to simulate the experience of landing on the page through an ad. The sense of urgency we are suggesting would only really apply if the user first got to the page from an ad click (a link that would not be easily findable again in the future).
First, respondents were shown a blank page that had this copy:
Imagine that you're a student interested in attending Culinary School. You need more information about culinary schools near you, so you go to Google on your smartphone, and search for "Culinary schools near me" ...
After clicking through to the next step of the test, they were shown the following text above an image of the search results:
You see an ad for the International Culinary Center in New York, which is near where you live! You decide to click on that ad to see what they have to offer. Go ahead and click on the second ad!
Here's the image they saw:
Here's a heat map showing that our respondents were actually following directions:
After clicking on the second ad on the page (the actual click on the image helps simulate the experience better), respondents were presented with either the mobile version of the new page or the mobile version of the original landing page (again, shown in an iPhone frame and scrollable).
Before being asked the question, we showed them this copy:
After clicking on the second ad, your mobile phone brings you to the following page ... take a minute to scroll through the entire page, and be aware of what the design of the page is telling you.
Then, they were asked one multiple choice question:
Let's say you're not ready to submit the form on the page yet. How would you get back to this page again once you left the page (or closed your browser)?
(The order of the two options was randomized to account for any order bias.)
For the new landing page test, a whopping 80% of respondents said this was a regular web page they could find again by going to the homepage, while 20% said they'd need to find the ad again.
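As a sanity check on a result like this (our own illustration here, not part of the original survey analysis), an exact binomial test shows that an 80-of-100 split is far too lopsided to be chance if respondents were really indifferent between the two answers:

```python
from math import comb

# Exact two-sided binomial test: how surprising is 80 of 100 respondents
# picking one option, under a 50/50 "indifferent respondents" null?
def binom_two_sided_p(successes, n, p=0.5):
    """Two-sided p-value: total probability of outcomes at least as extreme."""
    pmf = [comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]
    observed = pmf[successes]
    # Sum probabilities of all outcomes no more likely than the observed one.
    return sum(x for x in pmf if x <= observed * (1 + 1e-9))

p_value = binom_two_sided_p(80, 100)
print(p_value < 0.05)  # the 80/20 split is not noise
```

Even with only 100 respondents, a split this extreme gives a vanishingly small p-value, which is why the survey was convincing evidence for the "feels like a regular web page" perception.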
The test results supported our theory that the original landing page communicated a sense of urgency that the new, "nicer" landing pages did not.
With our theory supported, we decided to redesign the new pages with a mobile first design strategy, and simplify the design to give the impression to visitors that these pages were indeed standalone landing pages.
Here's an example of one of those newly designed pages:
Over time, we designed over 25 of these pages, and the results have been overwhelmingly positive. Without proper testing, hypotheses and usability analysis, these results would never have been possible.
After redesigning our new landing pages to look simpler and sparser (without sacrificing production quality), we saw conversion rates skyrocket across the spectrum of programs and classes. Instead of launching the new pages fully, we originally sent 50% of traffic to the new landing pages and 50% to the original landing pages. Once we had statistically significant conversion data, we transitioned 100% of online traffic to the new landing pages.
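A standard way to check significance in a 50/50 test like this is a two-proportion z-test. The sketch below uses hypothetical visitor and conversion counts, since the actual sample sizes and tooling aren't stated here:

```python
from math import sqrt, erf

# Two-sided z-test comparing two conversion rates (our illustration of the
# kind of significance check behind a 50/50 rollout decision).
def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Return (z, p_value) for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled rate under the null
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical counts: 4.42% vs 16.97% on ~2,000 visitors per arm.
z, p = two_proportion_z(conv_a=88, n_a=2000, conv_b=339, n_b=2000)
print(p < 0.05)
```

With lifts as large as the ones reported below, significance arrives quickly; the harder cases are the single-digit lifts, which need much more traffic before a 50/50 split can be called.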
Here's a sampling of conversion rate results across four main programs:
Professional Culinary Arts
Original: July 26 - January 14: 9.62%
Revised: January 15 - July 6: 10.79%
Overall Change: +12.11%
Professional Pastry Arts
Original: July 26 - January 14: 9.91%
Revised: January 15 - July 6: 11.21%
Overall Change: +13.06%
Intensive Sommelier Training
Original: June 14 - December 24: 8.45%
Revised: December 25 - July 6: 16.58%
Overall Change: +96.20%
Art of International Bread Baking
Original: October 18 - February 25: 4.42%
Revised: February 26 - July 6: 16.97%
Overall Change: +284.06%
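For clarity, the "Overall Change" figures are relative lifts, computed against the lower (pre-redesign) rate in each pair; small differences from the published percentages come from rounding of the underlying rates:

```python
# Relative lift in conversion rate from the pre-redesign baseline.
def relative_lift(original_rate, revised_rate):
    """Percent change from the original to the revised conversion rate."""
    return (revised_rate - original_rate) / original_rate * 100

print(round(relative_lift(8.45, 16.58), 2))  # Intensive Sommelier Training, ~96.2
print(round(relative_lift(4.42, 16.97), 2))  # Art of International Bread Baking, ~283.9
```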