A wikidocumentary is a layered, time-based presentation composed of simple elements, and it can be viewed in a few different ways.
- 1 What would a wikidocumentary look like?
- 2 Elements
- 3 Different editor views
- 4 Editor actions
- 5 Map-based wikidocumentary -- This has not been cleaned from outdated information
- 6 Old graphics
- 7 Navigation
What would a wikidocumentary look like?
- Web interface, page layout: Scrolling or clicking between frames. Maybe the user can change the layout and mode?
- Sit back cinema (auto-advance), full-screen, timed
- Mobile orienteering mode (random access). Frames are anchored to locations that the visitor can actually reach. The timelines must be flexible to allow for different transition times. Use soundscapes and other media loops, trigger frames by proximity etc.
- Support for different amounts of screen real estate
- Possibility to easily record images, sound, video or location
- Possibility to connect to other contributors or share
- Things that the user can be asked to do: Take pictures, record a fact, record audio...
- Wikidocumentaries as gradually evolving presentations: Users can add an aspect to an existing wikidocumentary.
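The mobile orienteering mode above implies some proximity logic. A minimal sketch of triggering frames when the visitor comes within a radius of their locations; all type and function names here are illustrative assumptions, not an existing Wikidocumentaries API:

```typescript
// Hypothetical types; the real frame model is still undecided.
interface GeoPoint { lat: number; lon: number; }
interface LocatedFrame {
  id: string;
  title: string;
  location: GeoPoint;
  triggerRadiusM: number; // how close the visitor must be, in metres
}

// Great-circle distance in metres (haversine formula).
function distanceM(a: GeoPoint, b: GeoPoint): number {
  const R = 6371000; // mean Earth radius in metres
  const toRad = (d: number) => (d * Math.PI) / 180;
  const dLat = toRad(b.lat - a.lat);
  const dLon = toRad(b.lon - a.lon);
  const h =
    Math.sin(dLat / 2) ** 2 +
    Math.cos(toRad(a.lat)) * Math.cos(toRad(b.lat)) * Math.sin(dLon / 2) ** 2;
  return 2 * R * Math.asin(Math.sqrt(h));
}

// Return the frames whose trigger radius contains the visitor's position.
function framesInRange(frames: LocatedFrame[], here: GeoPoint): LocatedFrame[] {
  return frames.filter((f) => distanceM(f.location, here) <= f.triggerRadiusM);
}
```

In a browser, the position fed to `framesInRange` would come from the Geolocation API's `watchPosition` callback.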
Elements
Frames are the building blocks of a wikidocumentary.
- Image or video. Later, multiple media files and an image sequence may be made possible.
- Frame audio can be a soundscape or a loop / generative sound played while the user is in the frame, or any sound that plays only once.
- Frame title and text. Possibly rich text with more images. Links.
- Location. Each frame can be given a coordinate location, which can be used in different ways: it can be plotted on a map, or it can be a place to visit to experience AR content.
- Map, with an opportunity to change the style
- Background overlay layer. Georeferenced imagery can be used as a background layer: an old map, an aerial image, aerial video, or alternatively a large zoomable image. It can also be a video overlay.
- Data overlay can be geographic data plotted on the map or annotations on an image or a map. A data point in a time series can be the basis for a frame.
- Audio track. The audio track can be a narrative, a musical piece, or an ambient recording for example. Frames are anchored on the timeline. It can be an audio file or the audio track of a video file, or the audio can be recorded in the app.
- Time series data (Wikidata query, file). Examples include a travel route or any animated thematic or statistical map.
- Transitions: Soundscapes, specific content. Data animations run and set the pace of the interval.
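The elements above suggest a frame record roughly like the following. The document does not fix a schema, so every field name and type here is an assumption for illustration only:

```typescript
// Sketch of a frame record covering the elements listed above.
// All names and types are illustrative assumptions.
interface Frame {
  title: string;
  text?: string;                                // possibly rich text with links
  media?: { kind: "image" | "video"; url: string }[];
  frameAudio?: { url: string; loop: boolean };  // soundscape loop or one-shot sound
  location?: { lat: number; lon: number };      // map plot or AR anchor
  timelineStart: number;                        // seconds on the documentary timeline
  timelineEnd: number;
}

interface Wikidocumentary {
  title: string;
  audioTrack?: { url: string };  // narrative, music, or ambient recording
  mapStyle?: string;
  frames: Frame[];               // anchored on the timeline via their start/end
}
```

Keeping media, audio, and location optional matches the idea that a wikidocumentary evolves gradually: contributors can fill in elements one at a time.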
Different editor views
When the starting material is a recording, all frames have to be placed on the timeline. The user can change the position of the frame on the timeline and enter the frame editor for more details.
If the base material is a text or an article, the text editor suits that use best. Wikipedia articles could be automatically sliced into frames using AI technology that is used in some presentation program (which one?).
When the starting point is a collection of images or Wikidata items, the collection editor can be used to arrange them in time. The user may enter any of the frames for more detail.
In the frame editor view, the user can control all the individual elements within the frame. The user may add a soundtrack, which then needs to be edited in the timeline view.
The user may record an audio track to the whole documentary, or frame audio for a single frame.
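As a non-AI baseline for the article slicing mentioned above, an article could be split into one draft frame per section heading. This is a hypothetical sketch, not the AI approach the text refers to, and it assumes wikitext-style `== Heading ==` markup:

```typescript
// Naive baseline: slice an article into draft frames, one per
// "== Heading ==" section of the wikitext. Purely illustrative.
interface DraftFrame { title: string; text: string; }

function sliceArticle(articleTitle: string, wikitext: string): DraftFrame[] {
  const frames: DraftFrame[] = [];
  // The lead section becomes a frame titled after the article itself.
  let current: DraftFrame = { title: articleTitle, text: "" };
  for (const line of wikitext.split("\n")) {
    const heading = line.match(/^==\s*(.+?)\s*==$/);
    if (heading) {
      if (current.text.trim()) frames.push(current);
      current = { title: heading[1], text: "" };
    } else {
      current.text += line + "\n";
    }
  }
  if (current.text.trim()) frames.push(current);
  return frames;
}
```

The draft frames would then land in the collection editor for the user to arrange in time and enrich with media.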
Editor actions
New frame / Edit frame / Add image / Search image / Add sound / Search sound / (Adjust frame sound) / Add video / Adjust video
Map-based wikidocumentary -- This has not been cleaned from outdated information
Transferred from Map
Based on frames that consist of several parameters:
- Coordinate location
- View angle
- Time value
- Visual media
- Image plane and view angle for overlays (on the map background, in 3D space)
- Medium (video, image, image series, image pair)
- Headline & subheader
- Fixed timeline throughout presentation, such as a narrative
- Fixed timeline over one frame
- Soundscape loops or generative soundscapes
- Effects, synchronized sound
- Map settings
- Map style
- Timeline display
- Map is the base element
- Each frame represents a point from which the map is seen
- Map settings are different for each map frame
- An aerial image/video/illustration may overlay the map
- A media element may be added to the 3D view
- An image pair can be viewed with a revealing slider
- Map remains controllable in frames
- Text can be scrollable including parallax elements.
- Media can be used as part of layout in the text stream. Media can be expanded to cover the full screen.
- Transitions from one frame to another may also differ, such as a crossfade between media elements.
- Frame display triggered by GPS info
- Dynamic output
- soundscape, colours, alternative content, nearby geotagged content
- depending on data
- proximity to a coordinate location
- weather data
- time of day, time of year...
- Interface to create frames and configure settings
- AI-assisted frame generation from Wikipedia articles
- User-enhanced in the same interface
- Animations of time series: how do they relate to frames? Can frames be marked as time intervals, or as points in time?
- Can wikidocumentaries be multidimensional, including for example temporal depth across their own timelines?
- The map background is used in navigating between frames. Frames can be accessed in random order.
- The frames can display AR content. Ideas?
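The "dynamic output" idea above could work by selecting content variants from ambient conditions. A minimal sketch for one condition, time of day; the variant names and hour thresholds are invented for illustration:

```typescript
// Pick a soundscape variant for a frame from the hour of day.
// Variant names and thresholds are illustrative assumptions.
type SoundscapeVariant = "dawn" | "day" | "dusk" | "night";

function soundscapeForHour(hour: number): SoundscapeVariant {
  if (hour >= 5 && hour < 8) return "dawn";
  if (hour >= 8 && hour < 18) return "day";
  if (hour >= 18 && hour < 22) return "dusk";
  return "night";
}
```

Weather data or proximity to a coordinate could feed the same kind of selector, each condition mapping to alternative media, colours, or nearby geotagged content.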