The best way to start developing with Speechly is to complete this Quick Start. It guides you through the basics of building Spoken Language Understanding (SLU) models with Speechly, from creating an application to configuring, deploying, and testing your SLU model.
The first step is to navigate to the Speechly Dashboard to create an account and accept the terms and conditions. Speechly is currently in private beta, so only a select audience has a link to the Dashboard; sign up for the waiting list on our website to gain early access.
After creating a user account, you will land on the Speechly Dashboard main page, where you manage your applications.
Proceed to create a new application by clicking the blue Create application button.
When creating a new application, you first name it and choose the language for the speech recognition model. You can initialize your application by selecting a ready-made SLU Example template, or you may start with an empty template.
Now that you have created your first application, you can start editing the SLU Example configuration. The examples you write here configure your Spoken Language Understanding (SLU) model, so take your time to come up with a good set of examples: this is the most important part of building a well-working voice user interface.
Speechly is meant for building voice applications that can be controlled in natural language, not only with predefined commands. The examples you provide for the model are not meant to include every possible utterance your users can say. Rather, the aim is to provide our machine learning algorithms with enough sample data to build a smart model that can recognize the intent and entities even from utterances that have not explicitly been configured.
SLU examples are defined by writing example utterances and annotating them with intents and entities. The intent is defined first, after which the entities are annotated. For instance, in the example below, *order defines that the intent of this utterance is ordering. The utterance has two annotated entities: “pizza Margherita”, an entity of type product, and “extra cheese”, an entity of type topping.
Note: the intent name is not part of the utterance.
Even with a simple model, it is recommended to write several annotated example utterances in the SLU examples. For more information about writing SLU examples, please read Editing SLU Examples.
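To make the annotation format concrete, the snippet below parses one annotated example into its intent, entities, and plain utterance. This is an illustrative sketch of the syntax only, not part of the Speechly tooling:

```python
import re

# Matches [entity text](entity_type) annotations in an example utterance.
ENTITY_RE = re.compile(r"\[([^\]]+)\]\(([^)]+)\)")

def parse_example(line):
    """Split an annotated SLU example into (intent, entities, plain utterance)."""
    intent, _, rest = line.partition(" ")
    intent = intent.lstrip("*")  # the leading * marks the intent name
    entities = [
        {"value": value, "type": etype} for value, etype in ENTITY_RE.findall(rest)
    ]
    # The plain utterance is the example text with the annotation markup removed.
    utterance = ENTITY_RE.sub(r"\1", rest)
    return intent, entities, utterance

intent, entities, utterance = parse_example(
    "*order I want [pizza Margherita](product) with [extra cheese](topping)"
)
# intent is "order"; entities contain a product and a topping;
# utterance is "I want pizza Margherita with extra cheese"
```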
Here is a copy-pasteable example template to try if you don’t want to configure your own examples just yet. It defines a home automation application that can be used to control lights in three rooms: the kitchen, the living room, and the bedroom.
*turn_on switch on the [living room](location) light
*turn_on turn on the [kitchen](location) light
*turn_on turn on the [kitchen](location) lights
*turn_on put the [living room](location) light on
*turn_on put on the [living room](location) lights
*turn_on turn on the lights in the [bedroom](location)
*turn_on switch the [kitchen](location) lights on
*turn_on turn the [kitchen](location) lights on
*turn_on put the lights in the [living room](location) on
*turn_on bring up the [living room](location) lights
*turn_on let there be light in the [kitchen](location)
*turn_on [kitchen](location) lights up
*turn_on [living room](location) light on
*turn_on let's turn on the [bedroom](location) lights
*turn_on turn on the lights in [kitchen](location)
*turn_on turn the lights on in the [living room](location)
*turn_on make the [kitchen](location) lit
*turn_on lights on in [bedroom](location)
*turn_off switch off the [living room](location) light
*turn_off turn off the [kitchen](location) light
*turn_off turn off the [bedroom](location) lights
*turn_off put the [bedroom](location) light off
*turn_off put off the [living room](location) lights
*turn_off turn off the lights in the [bedroom](location)
*turn_off switch the [kitchen](location) lights off
*turn_off turn the [kitchen](location) lights off
*turn_off let's switch the lights in the [living room](location) off
*turn_off bring down the [living room](location) lights
*turn_off darken the [kitchen](location) for me please
*turn_off [kitchen](location) lights down
*turn_off [living room](location) light down
*turn_off i want the [bedroom](location) lights off now
*turn_off turn off the lights in [kitchen](location)
*turn_off turn the lights off in the [living room](location)
*turn_off make the [kitchen](location) dark
*turn_off lights off in [bedroom](location)
For more complex configurations, see Advanced SLU Examples.
When you are happy with your SLU example configuration, or you have pasted in the example above, you are ready to deploy your application and test it. When you click Deploy, the SLU examples are used to train a new SLU model, which is then deployed. Training usually takes less than a minute, but with bigger training sets it can take significantly longer.
If there are errors in your configuration, you need to resolve them before you can publish your application. Once the configuration is error-free, the Try button below the SLU Example configuration pane becomes active. It opens the Speechly Playground, where you can test your new SLU configuration in practice and share it with your colleagues or other stakeholders.
The results show how the SLU model translates spoken utterances into text, and which intents and entities it identifies. While speaking, you will see the results appear in real-time on the Speechly Playground. This should give you some ideas on how to leverage the real-time SLU results in the UI of the client you build on top of the SLU application.
For instance, if you say, “Turn off the kitchen light” with the example configuration above, the model should return the intent turn_off and the entity kitchen of type location. If you integrated this into, for example, an iPhone app, the app would turn off the kitchen lights whenever the Speechly API returns these values.
The API also returns the utterance transcription, which can function as feedback to the user or be ignored. Most of the time, it’s enough to get the intent and entities correct.
You can share a link to the Playground by clicking Share and copying the link. When sharing the model, make sure your users know what kinds of utterances they can say to it. For example, tell them they can control lights in different rooms. In return, ask your users whether the results they got were reasonable. If not, ask what they said and update your model to cover those utterances too.
The last step is to connect the SLU model to your own client using one of our client libraries. The model is integrated into your application with its app ID, which you can find in the configuration view and in the settings of your application. Happy developing!
Voilà, you’re done with the quick start! Now you can proceed with the browser client library quick start.
Last updated by karoliina-louhema on June 18, 2020 at 10:56 +0300
Found an error in our documentation? Please file an issue or make a pull request.