Archive

HoloLens

Introduction

I decided to build a next-generation IDE targeting the HoloLens platform. This project is code-named Holoware.

Framework Requirements

The IDE will serve as a framework for building applications using the following workflow:

• Identify architectural layers

• Construct modules for each layer

• Construct classes for each module

The IDE will leverage both the keyboard and gestures throughout the user experience of the software development process. Gestures can be used to manipulate panes (e.g. expand, shrink) and to drill into and out of an architecture (e.g. layers, modules, classes).
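
To make that workflow concrete, here is a minimal TypeScript sketch of how the layer/module/class hierarchy and the drill-in and drill-out gestures might be modeled. Everything below (LayerNode, Focus, drillIn, drillOut) is hypothetical and is not part of any existing Holoware API.

```typescript
// A minimal sketch of the layer/module/class hierarchy described above,
// with a cursor ("focus") that a drill-in or drill-out gesture could move.
// All of these types and functions are hypothetical.

interface ClassNode { name: string; }
interface ModuleNode { name: string; classes: ClassNode[]; }
interface LayerNode { name: string; modules: ModuleNode[]; }

type Focus =
  | { kind: "architecture" }                                   // all layers visible
  | { kind: "layer"; layer: LayerNode }                        // one layer's modules
  | { kind: "module"; layer: LayerNode; module: ModuleNode };  // one module's classes

// A drill-in gesture narrows the focus to the selected layer or module.
function drillIn(focus: Focus, selection: string, architecture: LayerNode[]): Focus {
  switch (focus.kind) {
    case "architecture": {
      const layer = architecture.find(l => l.name === selection);
      return layer ? { kind: "layer", layer } : focus;
    }
    case "layer": {
      const mod = focus.layer.modules.find(m => m.name === selection);
      return mod ? { kind: "module", layer: focus.layer, module: mod } : focus;
    }
    case "module":
      return focus; // classes are the deepest level in this sketch
  }
}

// A drill-out gesture widens the focus by one level.
function drillOut(focus: Focus): Focus {
  switch (focus.kind) {
    case "module": return { kind: "layer", layer: focus.layer };
    case "layer": return { kind: "architecture" };
    default: return focus;
  }
}
```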

Holoware as a vehicle

This IDE will attempt to leverage the metaphor of driving a car. Instead of driving a car, however, the targeted user experience is a development vehicle for driving and monitoring code.

Do the Following:

Get behind the steering wheel of a car and observe its user interface. Specifically, notice how the vehicle's design enables you to observe real-time events. What do you see?

Windows

A car has several windows, also known as panes (i.e. the windshield, side windows, and rear window). The rear window enables a user to view their past as it relates to their journey. The side windows enable a user to view activity occurring in parallel, relative to their vehicle's progress.

Mirrors

In addition to its windows, a car also has mirrors that are tightly coupled to the positions of particular windows (i.e. rear-view, driver's-side, and passenger-side). These mirrors have a dual function. Not only do they reflect activity, they do so from the unique perspective of their location relative to the position of a window. More specifically, while a window enables the observation of an activity, the mirror paired with that window provides further detail (e.g. a history or a log).

Dashboard

A driver of a vehicle leverages a dashboard to monitor that vehicle’s health. This IDE will attempt to follow that metaphor.
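
As a rough illustration of how this metaphor might map onto IDE surfaces, the following TypeScript sketch pairs panes (windows) with optional detail views (mirrors) and a set of dashboard indicators. The names here (Pane, Mirror, DashboardIndicator) are hypothetical and exist only to illustrate the mapping.

```typescript
// A hypothetical mapping of the vehicle metaphor onto IDE surfaces.

// A mirror adds perspective: the history or log behind a pane's activity.
interface Mirror {
  position: "rear-view" | "driver-side" | "passenger-side";
  history: string[];            // e.g. commit log, build output
}

// A pane observes an activity, like a window on a car.
interface Pane {
  name: string;                 // e.g. "windshield", "rear window"
  observes: string;             // the activity shown through the pane
  mirror?: Mirror;              // optional detail view coupled to the pane
}

// The dashboard surfaces health signals for the system being driven.
interface DashboardIndicator {
  label: string;                // e.g. "build status", "failing tests"
  value: string;
  warning: boolean;
}

const panes: Pane[] = [
  { name: "windshield", observes: "the code currently being edited" },
  {
    name: "rear window",
    observes: "work already completed",
    mirror: { position: "rear-view", history: ["commit log", "build output"] },
  },
];

const dashboard: DashboardIndicator[] = [
  { label: "build status", value: "passing", warning: false },
  { label: "failing tests", value: "2", warning: true },
];
```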

Conclusion

You can find the Holoware project on GitHub.

Introduction

We are beginning a new era of user experiences that will soon be introduced to us via augmented reality. This technology offers a new perspective on the tooling that can be leveraged to design and build the software systems of the future. Specifically, the software industry can now collaborate on how to design the future user experience of building software systems through augmented reality.

This article is the first of several that will detail the possibilities of future software development.

Areas of Enhancement

Instead of common software development involving a keyboard and a dual-monitor setup, imagine building software with just a keyboard and a HoloLens. In addition, imagine a future IDE that enables software development and maintenance at various levels of abstraction. For example, what if developers had an architectural view of their code and could drill in to a specific layer of that architecture by selecting a layer (e.g. UI) to view its modules, or zoom back out of that abstraction, all while wearing their augmented reality headset?

Here's another example. What if a developer could place various panels (e.g. Test Explorer, Output Window, CodeMap, and CallStack) at different depths of their view, or dock them to different corners of their view relative to their text editor? Again, using augmented reality, a developer could zoom in and out across the IDE's range of views. A developer could also look left, right, up, and down to interact with other sets of tools and data (e.g. debugging windows) as they pertain to the IDE's text editor.
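
To show what such a layout might look like as data, here is a hypothetical TypeScript sketch that assigns each panel a corner anchor and a depth relative to the text editor. The placement model is imagined; only the panel names come from Visual Studio.

```typescript
// A hypothetical sketch of panel placement around the text editor in an
// augmented-reality workspace: each panel gets a corner anchor and a depth
// relative to the editor plane.

type Corner = "top-left" | "top-right" | "bottom-left" | "bottom-right";

interface PanelPlacement {
  panel: string;        // e.g. "Test Explorer", "Output Window"
  anchor: Corner;       // corner of the field of view, relative to the editor
  depthMeters: number;  // distance behind the editor plane
}

const workspace: PanelPlacement[] = [
  { panel: "Test Explorer", anchor: "top-right",    depthMeters: 0.5 },
  { panel: "Output Window", anchor: "bottom-left",  depthMeters: 0.3 },
  { panel: "CodeMap",       anchor: "top-left",     depthMeters: 1.0 },
  { panel: "CallStack",     anchor: "bottom-right", depthMeters: 0.3 },
];

// Looking toward a corner could bring the panel anchored there into focus.
function panelAt(anchor: Corner): PanelPlacement | undefined {
  return workspace.find(p => p.anchor === anchor);
}
```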

In short, there are several areas within the user experience of building software that can be redesigned. The areas this series will discuss are Notifications, UI Module Placement, and Development Views.

The following is a summary of the areas that will be discussed in the upcoming series.

  • Notifications
    • Automated Tests
    • Work Items (defects, tasks)
  • UI Module Placement (docking)
    • Test Results
    • Output Window
    • Application runtime
    • Editors
      • Designer (Blend)
      • Code (Visual Studio)
  • Development Views
    • Classic
      • Prior versions (Visual Studio)
    • Architectural
      • UI
      • Services
      • Model
      • DAL
    • System
      • Clients
      • Servers (Build, Database, Source Control)
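
As a rough sketch, the outline above could also be restated as data. The following hypothetical TypeScript types do exactly that; none of them correspond to an existing API.

```typescript
// Hypothetical types mirroring the outline above; they simply restate the
// three areas (Notifications, UI Module Placement, Development Views) as data.

type Notification =
  | { kind: "automated-test"; suite: string; passed: boolean }
  | { kind: "work-item"; itemType: "defect" | "task"; id: number; title: string };

type DockedModule =
  | "Test Results"
  | "Output Window"
  | "Application runtime"
  | "Designer (Blend)"
  | "Code (Visual Studio)";

type DevelopmentView =
  | { kind: "classic"; visualStudioVersion: string }
  | { kind: "architectural"; layer: "UI" | "Services" | "Model" | "DAL" }
  | { kind: "system"; node: "client" | "build server" | "database" | "source control" };

// Example: a failed test run surfaced as a notification in the headset.
const example: Notification = { kind: "automated-test", suite: "unit", passed: false };
```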

Conclusion

In conclusion, we are beginning a new era of user experiences that will soon be introduced to us via augmented reality. This technology offers a new perspective on the tooling that can be leveraged to design and build the software systems of the future. Specifically, the software industry can now collaborate on how to design the future user experience of building software systems through the lens of augmented reality. The coming articles in this series will discuss Notifications, UI Module Placement, and Development Views.