Details
Description
The mediacast demo needs an upgrade to illustrate some of our
device integration features (augmented reality) and serve as
a JavaOne demo, possibly as part of a contest.
The main difference is actually with the "message" concept
in the application. Currently a message is just a photo,
video, or audio clip with a title and some thumbnails. In
the new version:
A media message will contain the following:
- title (auto-generated if not provided)
- photo (optional)
- video (optional)
- audio (optional)
- location and direction (optional)
- text paragraph (optional)
- tags (optional)
If a photo is not provided, a frame from the video will be used
(as we generate video thumbnails now).
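The new message model above could be sketched roughly as follows. This is a hypothetical shape, not the demo's actual classes; the field names, the title fallback, and the `".frame.jpg"` thumbnail naming convention are all assumptions.

```java
import java.util.LinkedHashSet;
import java.util.Set;

// Hypothetical sketch of the new "media message" concept:
// every field except the tags set may be left unset.
public class MediaMessage {
    private String title;     // auto-generated if not provided
    private String photo;     // optional photo file name
    private String video;     // optional video file name
    private String audio;     // optional audio clip file name
    private Double latitude;  // optional location...
    private Double longitude;
    private Double direction; // ...and direction
    private String text;      // optional text paragraph
    private final Set<String> tags = new LinkedHashSet<>(); // optional tags

    // Auto-generate a title when none was provided.
    public String getTitle() {
        if (title == null || title.isEmpty()) {
            return "Message " + Integer.toHexString(hashCode());
        }
        return title;
    }

    // If no photo was provided, fall back to a frame extracted from
    // the video (the demo already generates video thumbnails).
    public String getThumbnail() {
        if (photo != null) {
            return photo;
        }
        if (video != null) {
            return video + ".frame.jpg"; // assumed naming convention
        }
        return null;
    }

    public void setTitle(String t) { title = t; }
    public void setPhoto(String p) { photo = p; }
    public void setVideo(String v) { video = v; }
    public Set<String> getTags() { return tags; }
}
```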
The user has a current set of tags (space separated in a text field).
They will see only messages with those tags and will create messages
with those tags. (Possible feature: some tags could be reserved, such
as "<1km", to provide filtering features that are not directly tag based.)
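The tag matching described above could look something like this. A minimal sketch, assuming the class name, the lowercase normalization, and the "share at least one tag" rule; reserved tags such as "<1km" would need to be intercepted before plain matching.

```java
import java.util.LinkedHashSet;
import java.util.Set;

// Hypothetical sketch: parse the user's space-separated tag field
// and keep only messages that share at least one tag with it.
public class TagFilter {
    private final Set<String> current = new LinkedHashSet<>();

    // Parse the space-separated text field into the current tag set.
    public void setCurrentTags(String field) {
        current.clear();
        for (String t : field.trim().split("\\s+")) {
            if (!t.isEmpty()) {
                current.add(t.toLowerCase());
            }
        }
    }

    // A message is visible if it shares at least one tag with the
    // user's current set. (A reserved tag like "<1km" would be
    // handled separately, before this plain tag test.)
    public boolean matches(Set<String> messageTags) {
        for (String t : messageTags) {
            if (current.contains(t.toLowerCase())) {
                return true;
            }
        }
        return false;
    }
}
```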
Application must work with stock browser, container, and ICEmobile-SX.
ICEmobile-SX will cause a page reload when a device feature completes,
so the "current page" must be stored in the session.
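The reload handling could be sketched as below. This is an illustration only: a plain `Map` stands in for the real `HttpSession`, and the attribute key and default page name are assumptions.

```java
import java.util.Map;

// Sketch: because ICEmobile-SX reloads the page when a device
// feature completes, remember the current page per session and
// restore it after the reload. A Map stands in for HttpSession.
public class PageMemory {
    private static final String KEY = "mediacast.currentPage"; // assumed key

    // Record the page before handing off to a device feature.
    public static void remember(Map<String, Object> session, String page) {
        session.put(KEY, page);
    }

    // On reload, return the stored page, or a default start page.
    public static String restore(Map<String, Object> session) {
        Object page = session.get(KEY);
        return page != null ? (String) page : "home"; // assumed default
    }
}
```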
When the user requests a "Reality" view, thumbnails will be shown
for the tag-matching messages (this is implemented now for all
messages, so adding the tag feature will take care of the rest).
The purpose of the tag feature is to allow people to create their
own areas of interest within the application: "restaurant", "park",
"javaone2012".
Use case:
- tag with "restaurant"
- take a photo of restaurant sign
- include a video clip of the happy kitchen staff and your food
- write a short review
- press submit
- another user types in the tag "restaurant"
- they see restaurants in the AR view
- they select one
- after reading reviews, they follow the AR view to find the restaurant
(We will likely model a JavaOne contest around the above use case
somehow.)
So, it is really not too far from the current mediacast demo, but
will allow some very interesting use cases.
Ted.