Learn – Anki Developer (https://developer.anki.com)

Getting Started with Code Lab
https://developer.anki.com/blog/learn/tutorial/getting-started-with-code-lab/ · Thu, 07 Dec 2017

Like the Python SDK, Code Lab gives you easy, direct access to Cozmo’s advanced robotics and artificial intelligence (AI). The primary difference is that rather than having to learn the ins and outs of a text-based programming language (its syntax, for example), you can simply drag and drop blocks around an intuitive touch interface. It really is that simple.

To get started using Code Lab, all you’ll need is a Cozmo robot and a compatible mobile device. Just make sure you’ve updated to the latest version of the free mobile app to ensure you have all of Code Lab’s latest features. Though not necessary, we recommend using a tablet over a mobile phone if possible, as the increased screen real estate lets you see more code blocks at once.

Below is a list of helpful information and resources to help you get started using Code Lab. We encourage you to use it in as many ways as possible; after all, Code Lab’s versatility is one of its best features.

A video overview of Code Lab’s Sandbox and Constructor Modes.

General Information

What is Code Lab?
Code Lab is a graphical programming interface for Cozmo based on Scratch Blocks, a collaboration between Google and the MIT Scratch Team.

What can I do with it?
There’s so much that can be done with Code Lab, from creating new mini-games and activities to using Cozmo as an expressive robot actor in original short films. Be sure to check out the various sample projects included in the app for ideas and inspiration.

Can Code Lab be used in STEM education?
Absolutely. Better yet, not only can Cozmo be used to teach STEM concepts, many educators have already found him to be an exceptional educational tool due to his innovative hardware, expressive animations, and charming personality, all of which can lead to higher engagement from students. But you don’t have to take our word for it.

Using Code Lab

What’s the difference between Sandbox Mode and Constructor Mode?
There are quite a few differences between the two modes, but one notable difference is that Sandbox Mode is primarily icon-based, while Constructor Mode uses blocks that utilize more sophisticated programming features including variables, functions, and math operators.

  • Tip: Are you or your students ready to take the next step from Sandbox Mode? Try recreating a Sandbox Mode project in Constructor Mode as an extra challenge.

Sandbox Mode (left) compared to Constructor Mode (right).

Is there anywhere to see a list of all the Constructor Mode blocks?
There is a full block glossary contained inside Code Lab. To access it, simply tap the book icon found in the top-left portion of the screen while inside a Code Lab project. Alternatively, you can view the glossary in the official Code Lab forum as well.

  • Tip: For those times you don’t have Cozmo with you, you can use the online block glossary in the forums to plan projects and lessons in advance.

What is a Remix?
When you modify an existing Code Lab project, it is called a Remix. Remixing a project is a great way to use existing code as a foundation for your own. You can customize a project to your liking, add entirely new features to it, or even use the project as a piece to a larger, more sophisticated one.

  • Tip: Try adjusting some of the values / blocks in the included sample programs and Featured Projects in the app, and make note of how it changes the experience. How might you make it even better?

Where can I discuss / get help for / show off my Code Lab projects?
The official Code Lab forum is the best place to connect with other users, as well as members of the Code Lab team. If you’ve made a project you’re particularly proud of, submit it to us to potentially have it featured in the official Cozmo app!

  • Tip: If you’re a parent or educator, share your lesson plans, ideas, or curricula with the rest of the Code Lab community to get valuable real-world feedback.

Additional Resources

  • Code Lab F.A.Q. — An in-depth list of frequently asked questions that is regularly updated.

  • Best Practices for Code Lab Projects — Want to have your Code Lab project featured in the Cozmo app? Here are some guidelines that will help make your project the best it can be.

  • Using the Cozmo SDK in Education — Code Lab, in conjunction with the Python SDK, makes Cozmo a flexible education platform that scales from K-12 all the way through postgraduate research and beyond.

Finding Color with Cozmo
https://developer.anki.com/blog/learn/tutorial/finding-color-with-cozmo/ · Fri, 08 Sep 2017

One of the more recent additions we made to the Cozmo SDK is the ability to pull color images from Cozmo’s camera feed. This opens up a world of possibilities with the SDK, including color-dependent games and behaviors. In our SDK example program, color_finder.py, we have Cozmo search his environment for a user-specified color, and then pursue it.

The Color Finder program in action.

Max Cembalest, our wonderful summer intern, wrote the program and will take you through his approach in creating it.


Hi, I’m Max! I’ll show you step by step how to make Cozmo recognize and chase a tennis ball.

There are four primary steps needed to create this behavior:

  1. Define a threshold by which an arbitrary RGB value—of which there are literally millions—will be classified as simply, say, yellow.

  2. Locate “blobs” of the same color which, in this example, represent an object of that color.

  3. Calculate the rotation angle needed for Cozmo to face the colored object.

  4. Manage the various action states of scanning for a color object, rotating Cozmo towards said object, and then moving towards it.

1. Approximating Color

For a first pass at determining color, I wrote a method on_new_camera_image that continuously captured Cozmo’s camera image many times per second as the event EvtNewCameraImage was triggered. Then I grabbed the RGB values of the pixel in the center of the image, and passed those RGB values into a custom method called approximate_color:

def approximate_color(self, r, g, b):
    # Find the strongest channel and how far it exceeds the weakest one
    max_color = max(r, g, b)
    extremeness = max(max_color - r, max_color - g, max_color - b)
    if extremeness > 50:
        # One channel clearly dominates: name the pixel after it
        if max_color == r:
            return "red"
        elif max_color == g:
            return "green"
        else:
            return "blue"
    else:
        # No channel dominates: treat the pixel as white
        return "white"

approximate_color returns the name of the channel with the maximum value among {r, g, b} if that maximum is sufficiently larger than the other two channels (in this case, by a value of 50 or more); otherwise, the pixel is declared “white.” I tested the method by pointing Cozmo at blue and red images (Figure 1), and then printing out the result (Figure 2).


Fig. 1 The blue and red images used to validate the approximate_color method.

Fig. 2 The live output from the approximate_color method.

While the method worked well, it was only useful for simple images with clearly defined color boundaries. When using arbitrary camera images of the real world, the method consistently detected too much red. To mitigate this issue and make it easier for Cozmo to eventually recognize distinct color blobs, I downsized the camera image from 320×240 to 32×24 using a method from the PIL library and then used the results from applying approximate_color on the reduced image to create a 32×24 matrix (Figure 3).

Fig. 3 The 32×24 pixel matrix used to find color blobs.
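The downsizing step can be sketched as follows. This is a hypothetical reconstruction using Pillow; make_pixel_matrix and the standalone approximate_color here are illustrative names, not the actual code in color_finder.py:

```python
from PIL import Image

def approximate_color(r, g, b):
    # Same rule as the method above: a channel must dominate by more than 50
    max_color = max(r, g, b)
    extremeness = max(max_color - r, max_color - g, max_color - b)
    if extremeness > 50:
        if max_color == r:
            return "red"
        elif max_color == g:
            return "green"
        else:
            return "blue"
    return "white"

def make_pixel_matrix(image, width=32, height=24):
    # Shrink the frame so each cell blends a neighborhood of camera pixels,
    # which smooths out stray "red" noise before classification
    small = image.resize((width, height), Image.BILINEAR)
    pixels = small.load()
    return [[approximate_color(*pixels[x, y][:3]) for x in range(width)]
            for y in range(height)]
```

A solid-colored 320×240 input reduces to a uniform 32×24 matrix of color names; a real camera frame produces the kind of blocky approximation shown in Figure 3.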

2. Locating Color Objects

To locate color blobs in Cozmo’s view, I started with the standard connected-components algorithm, which counts how many distinct groups of connected equal-valued squares there are in a matrix.

A modification to the algorithm was needed, though, to uniquely identify each blob and keep track of which points are actually in the blob. To achieve this, I created a dictionary of unique {key : value} pairs, with each key being a unique number and each value being a list of points in the corresponding color blob.

I made a new BlobDetector class which took pixel_matrix from the ColorFinder class as input. The following algorithm determines, for every point in pixel_matrix, which blob to add it to:

  • If the point matches the color of the point immediately to the left, we add it to the blob of that left point.
  • Likewise, if the point matches the color of the point immediately above, we add it to the blob of that above point.
  • If the point matches neither, we make a new blob for that point.
  • But if the point matches both the above and left points, then we merge the left point’s blob and the above point’s blob (once we check that the left point and the above point don’t already belong to the same blob).

For each blob the algorithm found, I averaged the x and y values of all the points in that blob to approximate where the center of the blob resides in pixel_matrix, and printed out each (x, y).
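The merging rules above can be sketched as a single pass over the matrix. This is a hypothetical reconstruction with illustrative names, not the program’s actual BlobDetector class:

```python
def find_blobs(matrix):
    # matrix[y][x] is a color name. Returns {blob_id: [(x, y), ...]}.
    blobs = {}    # blob id -> list of points in that blob
    owner = {}    # (x, y) -> id of the blob the point belongs to
    next_id = 0
    for y, row in enumerate(matrix):
        for x, color in enumerate(row):
            left = owner[(x - 1, y)] if x > 0 and row[x - 1] == color else None
            up = owner[(x, y - 1)] if y > 0 and matrix[y - 1][x] == color else None
            if left is not None and up is not None and left != up:
                # The point joins two distinct blobs: merge the upper
                # blob into the left one before adding the point
                for point in blobs.pop(up):
                    owner[point] = left
                    blobs[left].append(point)
                target = left
            elif left is not None:
                target = left
            elif up is not None:
                target = up
            else:
                # Matches neither neighbor: start a new blob
                target = next_id
                blobs[target] = []
                next_id += 1
            blobs[target].append((x, y))
            owner[(x, y)] = target
    return blobs

def blob_center(points):
    # Average the x and y values to approximate the blob's center
    xs, ys = zip(*points)
    return sum(xs) / len(xs), sum(ys) / len(ys)
```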

To make this process easier to visualize and debug, I made the ColorFinder class inherit from cozmo.annotate.Annotator. Now I could draw the contents of pixel_matrix onto the viewer window.

The coordinate system of the pixel_matrix starts with (0,0) as the top-left point and (31,23) as the bottom-right point, so x values increase from left to right and y values increase from top to bottom. Even though the colors and shapes Cozmo could detect were only a rough approximation, the BlobDetector algorithm was working well: the (x, y) coordinates that printed out correctly tracked the ball as I moved it (Figure 4).

Fig. 4 Cozmo correctly tracking color.

3. Turning Towards the Blobs

Cozmo’s camera has attributes fov_x and fov_y which are angles that describe the full width and height of Cozmo’s field of view. So any perceived horizontal and vertical distance between a blob’s center and center screen can be expressed as angles which are fractions of fov_x and fov_y, respectively.

The exact values of fov_x and fov_y vary slightly between Cozmos because of minor manufacturing variances, but they are roughly 58 and 45 degrees, respectively. In this example image, the center of the blue blob is calculated to be 17.98 degrees to the right and 11.25 degrees up.
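Expressed as code, the offset-to-angle conversion might look like the sketch below. The coordinate and sign conventions here are assumptions, and in the real program fov_x and fov_y come from Cozmo’s camera object rather than hard-coded defaults:

```python
def blob_angles(cx, cy, fov_x=58.0, fov_y=45.0, width=32, height=24):
    # (cx, cy) is the blob center in the 32x24 matrix; the frame center is
    # ((width - 1) / 2, (height - 1) / 2). The offset, as a fraction of the
    # frame, scales the field of view into an angle.
    # Positive dx: blob is right of center. Positive dy: blob is above
    # center (matrix y grows downward, so the vertical offset is flipped).
    dx = (cx - (width - 1) / 2) / width * fov_x
    dy = ((height - 1) / 2 - cy) / height * fov_y
    return dx, dy
```

A blob centered exactly in the middle of the frame yields (0, 0): no turn and no head tilt needed.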

I called the following method every time I calculated the blob center. It uses the SDK methods robot.set_head_angle and robot.turn_in_place, which return Actions that activate Cozmo’s motors:

def turn_toward_blob(self, amount_to_move_head, amount_to_rotate):
    new_head_angle = self.robot.head_angle + amount_to_move_head
    self.robot.set_head_angle(new_head_angle, in_parallel = True)
    self.robot.turn_in_place(amount_to_rotate, in_parallel = True)

I set Cozmo’s color_to_find to blue, and tested out the tracking using a blue notepad. It was clear the approximate_color method wasn’t strong enough to pick up all the blue (Figure 5).

Fig. 5 Tracking a blue notepad.

I noticed Cozmo’s reaction time was lagging a bit, but soon realized that I was essentially clogging Cozmo with conflicting actions. For instance, when I thought Cozmo should be turning left, he was completing a right turn because of a robot.turn_in_place call from 2 seconds ago.

To address this, I needed to make sure that Cozmo’s movements were only in response to current accurate information from his camera. So I assigned variables to store the current state of the action calls. This allowed me to only start action calls once all prior actions had been aborted.

I gave the ColorFinder class the attributes tilt_head_action and rotate_action, and added a new method abort_actions that aborts its input actions if they are active. My turn_toward_blob method now looked like this:

def turn_toward_blob(self, amount_to_move_head, amount_to_rotate):
    self.abort_actions(self.tilt_head_action, self.rotate_action)
    new_head_angle = self.robot.head_angle + amount_to_move_head
    self.tilt_head_action = self.robot.set_head_angle(new_head_angle, in_parallel = True)
    self.rotate_action = self.robot.turn_in_place(amount_to_rotate, in_parallel = True)

This resulted in a much better response time.
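The abort helper itself can stay very small. The following is a hedged sketch, written as a free function and assuming action objects that expose an is_running property and an abort() method, as the SDK’s Action objects do:

```python
def abort_actions(*actions):
    # Cancel any still-running actions so new motor commands are based on
    # fresh camera data rather than stale blob positions
    for action in actions:
        if action is not None and action.is_running:
            action.abort()
```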

A Note on Color Ranges
Colors in RGB space have three components between 0 and 255, so every color can be placed as a point inside a cube. The “closeness” of two colors can then be defined as the Euclidean distance between their points inside the RGB cube.

As you can see from the picture, there are large chunks of the cube that should satisfy the high-level condition to be “red,” “green,” or “blue.” So I defined a set of color ranges in the format (min R, max R, min G, max G, min B, max B).

To approximate a color, I measure its distance from each of the color ranges and settle on the range whose distance is shortest. Measuring the distance from a color to a color range is the same as measuring the shortest distance from a point to a cube. Because we are looking for the smallest distance, we can compare squared distances and avoid calculating the square root.
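Measuring the shortest squared distance from a point to an axis-aligned box can be sketched like this. The helper names and example ranges below are hypothetical, chosen only to illustrate the (min R, max R, min G, max G, min B, max B) format:

```python
def sq_dist_to_range(color, color_range):
    # color: (r, g, b); color_range: (min R, max R, min G, max G, min B, max B).
    # Per channel, add the squared gap to the nearest face of the box
    # (zero if the value already falls inside the interval).
    total = 0
    for value, lo, hi in zip(color, color_range[0::2], color_range[1::2]):
        if value < lo:
            total += (lo - value) ** 2
        elif value > hi:
            total += (value - hi) ** 2
    return total

def closest_range(color, ranges):
    # Pick the named range nearest the color; squared distances order the
    # same as true distances, so no square root is needed
    return min(ranges, key=lambda name: sq_dist_to_range(color, ranges[name]))
```

With ranges like {"red": (200, 255, 0, 60, 0, 60), "blue": (0, 60, 0, 60, 200, 255)}, a pixel such as (230, 40, 30) falls inside the red box (distance zero) and is classified as red.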

While not perfect, it reduced the amount of red noise in the background color, and individual objects became a bit more separated in the approximation — enough for Cozmo to locate and chase distinct colors.

4. Managing Action States

To get Cozmo to drive towards an object that matches his color_to_find, I wanted a way to move between the following three states:

  • Look around: Cozmo does not see any blobs that match his color_to_find, so he runs the LookAroundBehavior.
  • Found a color: Cozmo sees a blob that matches color_to_find, but is still rotating and tilting his head to get the blob to be stable in the center of his view.
  • Drive forward: Cozmo sees a blob that matches color_to_find, so he drives forward.

To do so, all that was needed was to add an attribute to the ColorFinder class called state, and at key points in the life cycle of the program, call the appropriate reaction based on the current value of state.

Changing between look_around_state and found_color_state can be done purely from within on_new_camera_image, since those transitions are based on what BlobDetector is picking up from the continuous camera feed.

The transitions in and out of drive_state are different, though. Every second, I checked the total sum of all the angles Cozmo had turned in that second. He drives forward only if that sum is below a threshold; otherwise, he remains in found_color_state.
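The three states and the one-second rotation check can be condensed into a small transition function. This is a hypothetical sketch; the state names mirror the ones above, and the threshold value is an assumption:

```python
LOOK_AROUND, FOUND_COLOR, DRIVE = "look_around", "found_color", "drive"

def next_state(blob_visible, degrees_turned_last_second, threshold=10.0):
    # No matching blob in view: go back to looking around
    if not blob_visible:
        return LOOK_AROUND
    # Blob in view and Cozmo has (nearly) stopped turning, so the blob
    # is stable in the center of his view: drive forward
    if degrees_turned_last_second < threshold:
        return DRIVE
    # Still rotating and tilting to center the blob
    return FOUND_COLOR
```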

Cozmo could drive now, but the program didn’t yet work with the color yellow, as I needed to further reduce the amount of overall red and yellow in the image, the most common colors found in the camera feed due to my particular lighting conditions.

Last touches

Color Balancing
I found an algorithm called Gray World which balances the color in an image by reducing the average color across the whole image to a neutral gray. Using it lowered the overall amount of red and yellow. It isn’t a perfect color-balancing algorithm because it may not be the case that the average color across the image should actually be perfectly gray (e.g., in an image of the sky the average color might be blue), but for my purpose, it was more than sufficient.
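A minimal Gray World pass might look like the following. This is a sketch of the general algorithm, not the exact code used in color_finder.py:

```python
def gray_world_balance(pixels):
    # pixels: a flat list of (r, g, b) tuples. Scale each channel so its
    # mean equals the overall mean brightness, pulling the average color
    # of the image toward neutral gray.
    n = len(pixels)
    means = [sum(p[c] for p in pixels) / n for c in range(3)]
    gray = sum(means) / 3
    return [tuple(min(255, round(p[c] * gray / (means[c] or 1)))
                  for c in range(3))
            for p in pixels]
```

An image with a red cast (a high red mean) has its red channel scaled down and the other channels scaled up until all three channel means agree.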

Remembering past blobs
Whenever Cozmo finds a blob, he stores the coordinates of the blob’s center in a variable called last_known_blob_center. So when no blobs are visible, Cozmo turns in the direction of last_known_blob_center, but with the magnitude of rotation scaled up. That way, he may slightly overshoot the object, but find it again while rotating past it.

Converting RGB to HSV
The last modification I made was processing color in HSV (hue, saturation, value) instead of RGB. The hue component specifies the color, the saturation specifies how much white is in the color, and the value specifies the brightness.

When I calculated color distance using HSV space instead of RGB space, the tennis ball was interpreted as almost entirely yellow, whereas before only the brightest parts of the ball were seen as yellow. That is, using HSV can be very useful in identifying the same color in dynamic lighting conditions.
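Python’s standard colorsys module handles the RGB-to-HSV conversion; a small wrapper (an illustrative helper, with hue reported in degrees for readability) might look like this:

```python
import colorsys

def rgb_to_hsv_degrees(r, g, b):
    # colorsys expects floats in [0, 1]; scale hue to degrees on the way out
    h, s, v = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
    return h * 360, s, v
```

A bright and a dim patch of the same surface share nearly the same hue even though their RGB values differ widely, which is why hue-based matching holds up better under changing light.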

In Closing

An application like this is never finished. There are always refinements to be made, features to be added, and so on. Off the top of my head, I’d love to see someone take the code and have Cozmo white-balance the image when the program starts, using the known color of a marker object. Another idea I’d love to see implemented is evolving the program into a game of fetch, where Cozmo pushes the ball back to the place he started from.


You can get the color_finder.py example program, as well as all of the others, from our Downloads page. Give it a go, and let us know what you think in the comments below!

Getting Started with the SDK
https://developer.anki.com/blog/learn/tutorial/getting-started-with-the-cozmo-sdk/ · Fri, 01 Sep 2017

We designed the Cozmo SDK to be as simple to use as possible, while also giving you access to powerful computer vision and robotics technologies such as facial recognition and path planning. If you have some experience programming or are comfortable using a command-line interface, getting the SDK up and running should be quick and easy.

Required Hardware

It’s helpful to understand how the SDK works before getting started. First, the SDK itself runs on a host computer. Second, your mobile device runs what we refer to as Cozmo’s engine (think of it as his “brains”), and must be connected to your computer via USB. Finally, Cozmo connects to your mobile device via his own secure WiFi network. The following diagram illustrates the full SDK setup:

Hardware setup for the Cozmo SDK.

One benefit of this arrangement is that it allows you to connect Cozmo to any number of third-party libraries via your computer’s network connection. You can run our Twitter example program to see this in action.

The full installation instructions will vary depending on your specific hardware setup. The video below guides you through a Windows / Android installation, but you can find other desktop and mobile OS tutorials on our official developer YouTube channel.

Android / Windows installation tutorial.

Once you’ve installed the SDK, we highly recommend running the official example programs to confirm that the SDK installed correctly, and as a way to check out many of the SDK’s features. If you don’t already have a favorite Python integrated development environment (IDE), many of us here at Anki use PyCharm Community edition. It’s free, open source, and has all of the features you need to develop using the SDK.

Next Steps

Some SDK users come in with a clear idea of what they want to create, while others simply want to utilize projects made by the community. That’s the great thing about the SDK — it can be a sophisticated development platform, or a way to augment your personal Cozmo experience through others’ creativity and work.

One SDK program that’s especially useful for both developers and casual users is the Cozmo Explorer Tool. It provides a simple interface to, among other things, see Cozmo’s camera view in real time, play any of Cozmo’s hundreds of animations, and control his movement / motors via a traditional WASD keyboard control scheme. This should help you get a sense of Cozmo’s capabilities very quickly.

The Cozmo Explorer Tool by @GrinningHermit.

After getting acquainted with Cozmo and the SDK, it’s really up to you where to go next. You could create a new game, use Cozmo as an actor for your YouTube films, or even tweak our example programs as a way to dive deeper into Cozmo’s advanced features. Whatever you do end up creating, be sure to show it off in our Showcase forum. Good luck!

SDK Resources

  • Installation Videos — Full video walkthroughs for iOS, Android, Windows, and macOS.
  • SDK Forums — Have a question or feedback about the SDK? Simply want to show off your new project? The forums are the place to talk directly to us and the rest of the community.
  • User Projects — Want to see what people have been doing with the SDK? Our curated playlists have everything from short films to new games.
  • Documentation — Official technical documentation for the SDK. Includes a complete API reference, links for example programs, and much more.
  • F.A.Q. — We’ve compiled a list of frequently asked questions regarding the SDK.
  • PyPI — The Python Package Index is a massive repository of third-party Python software.