Demos – The Website
Many technical solutions are related to a website or web-based service, and when demoing those solutions it helps to emulate the website the system interacts with. It might be showing how Fastly can log everything that happens at the edge, or showing a customized Conversational AI assistant from Cognigy. Sales Engineers will often need to duplicate or emulate a website as part of their demo. This reduces the distraction of context switching and helps viewers visualize how they could use what they are seeing.
How you emulate the website depends on how you need to interact with it. I’ll walk through a few options for cloning a website to demo against. First, though, it is worth covering why the website matters; then I’ll cover the different methods, from the simplest to the most complicated.
Why Emulate a Specific Site
A big part of the demo is storytelling. You want to show a day in the life, and how the solution makes someone’s day easier. Of course, a big part of that story is lowering costs or increasing revenue. Either way, the story will resonate more if you can at least show a site within the same industry. Even better if you can show the experience with the solution deployed on their own website. There is less context switching for them, and they can visualize the solution better.
The Simplest
The simplest approach is to just take a screenshot. Load up the page and maximize the web browser. On Windows you can use the Snipping Tool (Windows key + Shift + S) and capture just inside the browser window. On macOS, it is Shift + Command + 3. Copy the image file onto your web server. The following HTML is enough to have it stretch and fill the screen:
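Something along these lines works; the filename screenshot.png is a placeholder for whatever you named your captured image:

```html
<!DOCTYPE html>
<html>
<head>
  <style>
    /* Remove default margins so the image reaches the edges */
    html, body { margin: 0; height: 100%; }
    /* Stretch the screenshot to fill the viewport */
    img { width: 100%; height: 100%; display: block; }
  </style>
</head>
<body>
  <!-- screenshot.png is a placeholder for your captured image -->
  <img src="screenshot.png" alt="Demo site screenshot">
</body>
</html>
```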
A Bit More Complex
If a static screenshot is not enough, you can mirror the actual site with a command-line tool such as wget or httrack. For wget, I find that the following flags are helpful:
- -r for recursive.
- -np for no parents. Use this if you just want to crawl from that directory and down.
- -w <# of seconds> to wait between requests. Use this if you decide to set the User Agent to something else to get around robots.txt rules.
For more details, check out this wget tutorial.
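Putting those flags together, a mirror run might look roughly like this (the URL and the two-second wait are placeholder values — adjust for the site you are cloning):

```
# -r   recurse through links on the page
# -np  never ascend to the parent directory
# -w 2 wait two seconds between requests, to be polite to the server
wget -r -np -w 2 https://example.com/blog/
```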
With httrack, the following settings are relevant:
- Go down only, not up. Similar to -np for wget.
- Ignore agents – be kind if you do this.
- There are also settings related to older HTTP requests.
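If you prefer the command line over the GUI, a basic httrack mirror looks roughly like this (the URL, output directory, and filter pattern are placeholders; the GUI settings above map onto httrack’s command-line options, so check its documentation for the exact equivalents):

```
# Mirror the site into ./site-mirror, restricted to pages matching the filter
httrack "https://example.com/" -O ./site-mirror "+*.example.com/*"
```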
In the past, there was a WordPress theme site where you could download demonstration versions of the themes. They were minimalistic and just enough to get a nice static site up. Unfortunately, I can no longer find it, but that brings us to the last option.
The most complex option, to me, is to set up an instance of WordPress and build different sites based on the site you wish to emulate. This is nice when you want a site per industry, changing the name and colors based on the firm you are speaking with. It is a bit of work to create all of the content, of course. Note that you can run multiple sites on the same Apache instance, and they can even share the same database.
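As a sketch, two demo sites could live on one Apache instance as separate virtual hosts (the hostnames and paths below are placeholders):

```
# Example name-based virtual hosts, one per demo site
<VirtualHost *:80>
    ServerName retail-demo.example.com
    DocumentRoot /var/www/retail-demo
</VirtualHost>

<VirtualHost *:80>
    ServerName bank-demo.example.com
    DocumentRoot /var/www/bank-demo
</VirtualHost>
```

To share one database, each WordPress install can set a distinct $table_prefix in its wp-config.php (for example retail_ and bank_), which keeps the two installs’ tables apart in the same database.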