Summary
This week was the week of midterm presentations and open house demos. You can view our slideshow at the end of this post. Over the weekend and the Monday holiday, we worked on combining what each team member had been working on into a single project. This included modifying the windowing system to support three layers (background, text, and drawings) so that all features could be accommodated. The text layer was configured to display text the way we showed it in last week’s blog post, and the drawing layer was configured to be driven by the graphics script we developed (fed raycasting events so it knows where to draw). We also successfully merged in file transfer, which lets us drag over text files and images (PNGs and JPEGs).
Progress
The main focus for this week was putting all the pieces together so that we’d have a single app to demo. Until this week, windows could only display images. To add the drawing functionality and text display, we decided to use a three-layer structure nested under a single Unity GameObject. We noticed similarities between drawing and displaying text or images; for example, both drawing and text display require a background. As a result, we made the background the bottom layer of the three, defaulting to white in both cases. Text display sits in the middle layer and drawings appear in the top layer; both of these layers are transparent when they have no content. This makes displaying images easy: we can simply replace the white default background with the image we received over the network.
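The layering idea can be sketched in a few lines. This is an illustrative Python sketch of the compositing behavior, not our actual Unity C# code; the pixel representation and layer names are hypothetical stand-ins for how Unity renders the stacked layers.

```python
def composite(background, text_layer, drawing_layer):
    """Stack the three window layers bottom-to-top. A fully
    transparent pixel (alpha == 0) lets the layer below show
    through, which is why empty text/drawing layers don't hide
    the background."""
    result = list(background)
    for layer in (text_layer, drawing_layer):
        for i, (r, g, b, a) in enumerate(layer):
            if a > 0:  # opaque content covers what is underneath
                result[i] = (r, g, b, a)
    return result

WHITE = (255, 255, 255, 255)
CLEAR = (0, 0, 0, 0)
INK = (0, 0, 0, 255)

# A toy 3-pixel window: white background, empty text layer, and
# one drawn pixel in the middle of the drawing layer.
window = composite([WHITE] * 3, [CLEAR] * 3, [CLEAR, INK, CLEAR])
```

Because the upper layers stay transparent wherever there is no content, swapping the white background for image pixels is all it takes to display an image.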
Initially, an object without an image or text acts as a whiteboard for drawing. Depending on the type of content we receive over the network, the app either replaces the white background or adds text to the middle layer.
Whenever we display content from a text file, it gets a white background that resembles a sheet of printer paper. We achieved this by determining how much to resize the object and what font size to use.
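The sizing logic amounts to a little arithmetic. The sketch below shows one way to derive a paper-like window size and a matching font size; the constants (US Letter aspect ratio, characters per line) are illustrative assumptions, not the values in our Unity script.

```python
# height / width of US Letter paper (8.5" x 11")
LETTER_ASPECT = 11.0 / 8.5

def paper_window(width, chars_per_line=80):
    """Return (width, height, font_size) for a paper-like text window.

    The height follows the paper aspect ratio, and the font size
    scales with width so roughly chars_per_line characters fit on
    each line regardless of how large the window is.
    """
    height = width * LETTER_ASPECT
    font_size = width / chars_per_line
    return width, height, font_size
```

With a scheme like this, resizing the window keeps the text density constant, so the sheet always reads like the same piece of paper.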
Beyond the features in the demo, we also developed another feature later this week: enabling third-party prefab objects to use our script. To do so, we needed to make the script more flexible. Specifically, we added several options that let the user customize it to fit different objects. With our script attached, these objects gain features such as dragging and rotating, as well as the close button. This makes it easy to extend the range of things that can live in our workspace: we can simply import something from the Asset Store and include it in our app. For example, we first implemented the clock widget by applying the script to a clock from the Asset Store, though we did have to make adjustments for the transform and time display to be right. In the future, we plan to add more widgets similar to the clock.
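Conceptually, the customization boils down to a handful of per-prefab options. This Python sketch shows the shape of that configuration; the field names are hypothetical stand-ins for the serialized fields our Unity script exposes in the Inspector.

```python
from dataclasses import dataclass

@dataclass
class WindowOptions:
    """Per-prefab settings that let one shared script fit
    differently shaped third-party objects."""
    draggable: bool = True
    rotatable: bool = True
    show_close_button: bool = True
    close_button_offset: tuple = (0.5, 0.5)  # relative to object bounds
    scale: float = 1.0  # compensate for prefabs of different sizes

# A clock prefab, for instance, might need shrinking and no rotation:
clock_options = WindowOptions(rotatable=False, scale=0.3)
```

Keeping everything else at its default means a new prefab only needs the one or two overrides that matter for it.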
For our midterm presentation, we recorded three demo videos, which are included below. In the first one, we demonstrate how file transfer works. From the Windows file explorer, we use the new “Boundless workspace” option in the context menu’s “Send to” submenu to copy over files. First we send photos of two staff members, and then we send two text files.
In the second video, we recorded a demo of what drawing on the whiteboard looks like. Currently the pixels rendered onto the whiteboard are very sparse due to the slow update rate of the callback that takes input from the Magic Leap controller. Although we tried increasing the callback rate, the sparsity was not reduced significantly. To solve this, we will interpolate between the sparse points as we draw.
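The planned fix can be sketched as straightforward linear interpolation between consecutive controller samples. This is a hedged Python sketch of the approach, not our Unity C# implementation, and the step size is an assumed parameter.

```python
def densify(points, step=1.0):
    """Insert evenly spaced points between each pair of sparse
    samples so the stroke appears continuous despite the slow
    callback rate. `points` is a list of (x, y) pixel positions."""
    if not points:
        return []
    out = [points[0]]
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        dist = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5
        n = max(1, int(dist / step))  # segments to split this gap into
        for k in range(1, n + 1):
            t = k / n
            out.append((x0 + t * (x1 - x0), y0 + t * (y1 - y0)))
    return out

# Two samples 4 px apart become a continuous 5-pixel run:
stroke = densify([(0.0, 0.0), (4.0, 0.0)], step=1.0)
```

Since the interpolation only looks at consecutive pairs, it can run incrementally as each new controller sample arrives, rather than waiting for the whole stroke.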
In the final video, we take a closer look at text display, and also demonstrate how to close windows. The quality of the actual recording makes it a little hard to tell, but the text is easily readable. Not only that, but emoji are supported by the text-display solution we’re using, which we think is pretty cool.
Finally, if you’d like to view our midterm presentation’s slides, we’ve included them below.
Plan for next week
Next week, we’ll be out of town on Thursday and Friday due to Thanksgiving break, so we don’t expect to be able to put as much time into this project as in past weeks. Rather than describe only what we expect to accomplish next week, we’ve decided to list what we’re planning to work on for the rest of the quarter. Items are grouped by the team member who plans to take the lead, with higher priority items appearing first.
Paul:
Peyton:
Alan:
Tianqi: