Sirikata Wiki - User contributions [en] - MediaWiki 1.35.7
BuildTheCode - 2011-05-19T08:20:16Z - <p>Rryk: /* Without Cygwin (manual) */</p>
<hr />
<div>= Dependencies =<br />
<br />
Sirikata depends on quite a bit of external libraries. [[Dependencies]] gives a list of required and optional libraries and a brief description of how they are used.<br />
<br />
When building Sirikata you have two options: handle installing dependencies yourself or use our install scripts or precompiled binaries (depending on platform and library) to help you build and install the required libraries. If you want to perform the installation or need to write new scripts for a new platform, see [[Dependencies]] for instructions on how to do so.<br />
<br />
These instructions explain how to use our install script or precompiled binaries.<br />
<br />
{{note}} Some dependency files are quite large, especially the precompiled binaries. When you run the install commands, it may appear the script has hung, but it's likely just checking out the packages.<br />
<br />
{{note}} Throughout we'll be using the standard set of dependencies, which includes support for graphics, embedded browsers, scripting languages, and physics simulation via Bullet. If you don't need all these features, for example because you are only running the space server, you can use a different dependency target. Where <tt>depends</tt> is used, you can substitute one of the following to get a different set of dependencies:<br />
* <tt>minimal-depends</tt> - the minimal dependencies required to get the system building<br />
* <tt>minimal-graphics-depends</tt> - the minimal dependencies required to get 3D graphics working, i.e. to get a client running. Note that not all features of the client will be enabled by this<br />
* <tt>full-depends</tt> - all dependencies, enabling all features of the system; also allows use of root to install system packages<br />
<br />
== CMake ==<br />
<br />
We use CMake on all platforms to check for dependencies and generate a build script. The Linux dependency script will install CMake from your system's package manager if you allow it to use root. Otherwise, and on all Mac and Windows platforms, CMake is not installed automatically. You should use the installer from [http://www.cmake.org the CMake website] or install it using your system's package manager. CMake 2.6 or higher is required.<br />
<br />
{{note}} On OS X, when the installer asks if you want to install the command-line version of the tools, say '''yes'''. While our build system can be run through the GUI, some dependencies also use CMake and you do not want to have to perform those install steps manually.<br />
<br />
== Subversion ==<br />
<br />
Because of their size, dependencies are stored in subversion repositories. If you want to use our automated dependency scripts, you'll need Subversion installed. It is installed on Mac by default, Linux users can use their distribution's system package, and Windows users can either install it via Cygwin or use [http://tortoisesvn.tigris.org/ TortoiseSVN].<br />
<br />
== Simple Method: Makefile ==<br />
On all platforms, if you have the right tools (make and subversion, on Windows provided by Cygwin) a Makefile is provided in the root directory which checks out and either builds or extracts dependencies to <tt>sirikata/dependencies/</tt>:<br />
<br />
cd sirikata/<br />
make depends<br />
<br />
== Windows Notes ==<br />
The simple method requires Cygwin. All dependencies are installed locally in the Sirikata checkout. No root access is needed.<br />
<br />
{{note}} The Windows dependencies support Visual Studio 2008 and later, including Express editions. Visual Studio 2005 and earlier are no longer supported.<br />
<br />
=== Without Cygwin (manual) ===<br />
If you do not have Cygwin and do not want to install it, you can perform the steps manually. Using a subversion client, checkout<br />
http://sirikatawin32.googlecode.com/svn/trunk/ <br />
to <tt>sirikata/dependencies/</tt>.<br />
<br />
Go into the dependencies folder and unzip all of the packages directly into the dependencies directory. Make sure to select "Extract Here". If that option is not available, remove the name of the zip file from the end of the destination path so that it ends with "dependencies". The extracted directories should not have the same names as the zip files.<br />
<br />
Then go to the top-level directory and run these two commands:<br />
git submodule init<br />
git submodule update<br />
<br />
== Mac ==<br />
<br />
=== Snow Leopard ===<br />
<br />
With the release of Apple's Snow Leopard (10.6) operating system comes the opportunity to have software compiled for two different architectures running on the same machine: the 32-bit i386 and the 64-bit x86_64 (k8). Though Intel hardware has long been capable of running in a 64-bit address space, prior to 10.6 the operating system didn't support it. The default architecture is now 64-bit, but not all libraries and dependencies are 64-bit savvy, so it is best to compile for the 32-bit architecture.<br />
<br />
===== XCode and Developer Tools =====<br />
<br />
You need '''XCode Tools 3.2.x''' to build Sirikata. You can install these in parallel to XCode 4.0 if you already have that version. In particular, you need gcc-4.0 and g++-4.0. Version 4.2 won't work.<br />
<br />
You can either download XCode at [http://developer.apple.com/mac] or use the Snow Leopard DVD: under “Optional Installs”, install “Xcode.mpkg” to get 3.2. Use all default options. We highly recommend at least 3.2.2 as previous versions had issues with hanging.<br />
<br />
{{note}} If you already installed XCode 4, then gcc and g++ were upgraded to version 4.2. The simplest course of action is to (re)install XCode 3.2 in parallel from [http://connect.apple.com/cgi-bin/WebObjects/MemberSite.woa/wo/5.1.17.2.1.3.3.1.0.1.1.0.3.3.3.3.1 here] -- the setup should work fine next to XCode 4. You might be able to get by just by copying the old files (gcc-4.0 and g++-4.0) from /Developer-old/usr/bin back to /Developer/usr/bin, but this probably won't work for Snow Leopard (OS X 10.6).<br />
<br />
You may get errors of the following form: "CMake Error: The following variables are used in this project, but they are set to NOTFOUND." For the Boost_INCLUDE_DIR variable, you can edit CMakeCache.txt to set the variable to the directory containing the Boost header files. It should be something like sirikata/dependencies/boost_1_45_0/boost.<br />
<br />
== Linux ==<br />
The install script expects an Ubuntu system, 8.04 or greater.<br />
<br />
If you want to install the system package dependencies directly, issue the following command:<br />
<br />
sudo apt-get install \<br />
git-core cmake sed unzip zip automake1.9 nvidia-cg-toolkit jam g++ \<br />
libzzip-dev libxt-dev libxaw7-dev libxxf86vm-dev libxrandr-dev libfreetype6-dev \<br />
libxext-dev autoconf libtool libpcre3-dev flex bison patch libbz2-dev gawk \<br />
libglu1-mesa-dev tofrodos libglut3-dev freeglut3-dev scons libexpat1-dev \<br />
libgtk2.0-dev libnss3-dev libgconf2-dev gperf libasound2-dev subversion \<br />
libtool autoconf ruby flex libgsl0-dev libssl-dev libspeex-dev libxss-dev \<br />
libdbus-glib-1-dev libgnome-keyring-dev libxml2-dev libjpeg62-dev libcups2-dev \<br />
wget libcurl4-gnutls-dev<br />
<br />
These packages should cover everything you need for the entire system, including the graphical client and all scripting plugins.<br />
<br />
= Building Sirikata =<br />
<br />
We use CMake to generate our build scripts. Make is used on Mac and Linux and Visual Studio is used on Windows to perform the actual build. All three builds follow the same basic steps:<br />
* Run cmake, possibly modifying the configuration.<br />
* Run your build tool.<br />
<br />
== Windows ==<br />
<br />
Start up CMake and point both paths to sirikata/build/cmake. Hit configure twice. If you installed any dependencies in non-standard locations, point CMake to them now. To generate the build files, hit Generate.<br />
On Windows Vista or later you might want to change CMAKE_INSTALL_PREFIX to the directory where you want to install your compiled binaries and hit Configure again, because by default CMake will generate a Visual Studio project that tries to install compiled binaries into %PROGRAMFILES%. That requires administrative privileges, which is why the build will fail (unless you run Visual Studio with administrative privileges or have disabled UAC).<br />
<br />
Now browse to sirikata/build/cmake. Open Sirikata.sln and run Build All.<br />
<br />
This should result in all libraries, plugins, and binaries in sirikata/build/cmake/debug or sirikata/build/cmake/release, depending on which configuration you built.<br />
<br />
=== Notes ===<br />
* There aren't standard locations to search for dependencies on Windows. Obviously we've set up the build system to work cleanly with the dependency install script. The easiest way to get a manual installation of dependencies on Windows to work is to use the same layout as the install script, where all dependencies are located in sirikata/dependencies. If you don't do this, you will almost certainly need to manually specify the locations of some libraries in CMake.<br />
* The build is known to work for VS2008, for both regular and Express versions.<br />
* If you get error 0xc0000022, check the permissions of the sirikata top-level directory. If you used Cygwin's version of git, it will remove execute permissions for security purposes. You may enable execute by running "chmod -R +x sirikata" in Cygwin, or by granting yourself Full Control in Right Click->Properties->Security (make sure to check the Replace All Entries in Child Objects checkbox).<br />
<br />
==== DLL Path under windows ====<br />
In order to get Sirikata to find the required DLLs, you must copy all DLL files from these directories into the "build/cmake" directory:<br />
* dependencies\boost_1_37_0\lib<br />
* dependencies\installed-curl<br />
* dependencies\ogre-1.6.1\bin<br />
* dependencies\SDL-1.3.0\bin<br />
* dependencies\protobufs\bin<br />
<br />
Note that Sirikata currently does not work in Windows 2000, due to the lack of a few raw input functions (RegisterRawInputDevice) that SDL 1.3 uses. This will hopefully be fixed in SDL at some point to make it use the old DirectX raw input system.<br />
<br />
== Mac and Linux ==<br />
<br />
For convenience we provide a top level makefile which performs the standard build operations. If you want a default build and have used the install script for dependencies, do:<br />
<br />
cd sirikata/<br />
make<br />
<br />
If you'd like to run the build manually, do the following (essentially what is in the makefile):<br />
<br />
cd sirikata/build/cmake<br />
cmake . [-DCMAKE_BUILD_TYPE=Debug|Release]<br />
make<br />
<br />
To interactively adjust settings, for instance to point CMake to a different version of libraries:<br />
<br />
cd sirikata/build/cmake<br />
ccmake .<br />
<br />
When the build completes you should have the libraries, plugins, and binaries in sirikata/build/cmake.<br />
<br />
=== Additional Tools ===<br />
<br />
If you're planning on developing Sirikata, it can be helpful to use ccache or a distributed compiler tool like distcc. These tools speed up compilation, either by avoiding duplicate work after running make clean or by distributing compile jobs across the network. We provide a wrapper script that tries to use these automatically. Instead of the steps above, use this script:<br />
<br />
cd sirikata/build/cmake<br />
rm CMakeCache.txt # We need to clear out old settings if you already ran CMake<br />
./cmake_with_tools.sh .<br />
<br />
The script passes other settings along to CMake, so you can add any parameters you would normally pass to CMake to this command as well.<br />
<br />
=== Mac XCode Project ===<br />
<br />
Many Mac developers prefer to use the XCode programming environment.<br />
To do this, CMake will need to be configured to generate an XCode project.<br />
<br />
The steps are:<br />
<br />
1. test -d build/cmake/Debug || mkdir build/cmake/Debug<br />
:(the directory for XCode build products)<br />
2. python csv_converter.py build/cmake/Debug/scene.csv scene.db<br />
:(generates a scene for use by the XCode project)<br />
3. cd build/cmake<br />
4. ln -s ../../dependencies/Frameworks Frameworks<br />
:(fixes brittle paths in the Xcode project)<br />
5. cmake -G Xcode .<br />
6. cd Debug<br />
7. ln -s ../Sirikata.Protocol.dll Sirikata.Protocol.dll<br />
8. ln -s ../Sirikata.Runtime.dll Sirikata.Runtime.dll<br />
9. cd ..<br />
10. Launch Xcode<br />
11. Open Sirikata.xcodeproj<br />
12. Edit the project settings: in the General tab, choose the project root, select the build/cmake directory, and click OK. The value should just be <Project File Directory>.<br />
:(this fixes a problem that prevents the debugger from working)<br />
13. In the Project menu, select active target ALL_BUILD, active project executable cppoh, and active build configuration Debug<br />
14. Close XCode to save the project<br />
15. Reopen Sirikata.xcodeproj<br />
16. Build it<br />
17. In the debugger, confirm that you can set a break point, run the debugger, and confirm that it stops at the break point and displays source code<br />
:(if anything fails, troubleshoot back to step 10)<br />
<br />
= Running Sirikata =<br />
Assuming the previous steps have completed successfully, you should have a few binaries built and you are ready to [[Running Sirikata|run Sirikata]].</div>
JavascriptGraphicsAPI - 2010-12-10T12:54:38Z - <p>Rryk: /* Mouse Events */</p>
<hr />
<div>=Rationale for a common API to 3d graphics systems=<br />
<br />
Objects are sent across the thread barrier to alter the current scene graph being displayed--here's why:<br />
<br />
In modern engines, graphics framerates should not be tied to physics framerates, and networking events and their decoding should happen at the correct pace to keep up with the network adapter. Graphics, however, is tied to the DOM and therefore must be on the main thread. This forces both networking and physics onto webworker threads if there is to be any threading.<br />
<br />
These threads need to advertise state changes to the main graphics thread so that the scene graph may be altered at the graphics rate. This requires that the individual physics and networking threads send timestamped events to the graphics system which drive changes to it. <br />
<br />
Since graphics can run at different rates and the updates from the network may be irregular, the graphics (main) thread needs a smooth interpolation scheme between the current position and the timestamped updates sent by the network/physics thread(s). The interpolation scheme should use cubic interpolation based on the most recent update's position and orientation along with the displayed position and orientation of the object at the time that update was received, to provide a smooth result.<br />
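The cubic scheme described above can be sketched as standard cubic Hermite interpolation. The function below is illustrative only (not part of any Sirikata API) and assumes the velocity tangents have already been scaled by the interval between updates:

```javascript
// Sketch: cubic Hermite interpolation between the state displayed when an
// update arrived (p0, v0) and the update's own state (p1, v1).
// t is in [0,1]; v0 and v1 are assumed pre-scaled by the update interval dt.
function hermite(p0, v0, p1, v1, t) {
  const t2 = t * t, t3 = t2 * t;
  // Standard cubic Hermite basis functions
  const h00 = 2 * t3 - 3 * t2 + 1;
  const h10 = t3 - 2 * t2 + t;
  const h01 = -2 * t3 + 3 * t2;
  const h11 = t3 - t2;
  return p0.map((p, i) => h00 * p + h10 * v0[i] + h01 * p1[i] + h11 * v1[i]);
}
```

Orientations would need the quaternion analogue (e.g. slerp between keyframes) rather than componentwise blending.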
<br />
Below are some example objects that may be sent cross-thread. The objects are listed in JSON format so the type information should be clear from the example.<br />
<br />
=API To Graphics System=<br />
Graphics should provide a constructor that takes a callback and a parent DOM element and returns an object with a send(obj) method, which accepts serializable objects from other threads that modify graphics state, and an optional destroy() method, which cleans up graphics state in the DOM.<br />
<br />
So a sequence of code to construct a graphics system, create an object, and destroy it could look like:<br />
gfx= new GLGERenderer(callbackFunction,parentElement)<br />
gfx.send({ msg:"Create", id:"f47ac10b-58cc-4372-a567-0e02b2c3d479", time: 2181298451298491284, pos:[1,2,3], orient:[.5,0,0,.5]})<br />
gfx.destroy();<br />
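A minimal stand-in for the constructor contract above might look like the following. StubRenderer is a hypothetical name: it just records object state in a map, whereas a real backend such as GLGERenderer would update a scene graph and use the callback to deliver input events:

```javascript
// Hypothetical minimal renderer satisfying the contract: constructor takes a
// callback and a parent DOM element; send(obj) mutates graphics state;
// destroy() cleans up.
function StubRenderer(callback, parentElement) {
  // callback would deliver input events (see Event handling); unused here.
  // parentElement would host the canvas/DOM output; unused in this stub.
  const objects = {};            // id -> last known state
  this.send = function (obj) {
    if (obj.msg === "Create") objects[obj.id] = obj;
    else if (obj.msg === "Destroy") delete objects[obj.id];
    else if (objects[obj.id]) Object.assign(objects[obj.id], obj);
  };
  this.destroy = function () {   // clean up all state
    for (const id in objects) delete objects[id];
  };
  this.objects = objects;        // exposed for illustration only
}
```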
<br />
=Cross thread communication from Physics and Networking to graphics=<br />
==Object Management==<br />
IDs can be anything from human-readable strings to UUIDs to integers. Each must simply be unique and chosen by the user of the API.<br />
===Creating a new graphics object===<br />
<br />
{<br />
msg:"Create",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
spaceid:"aaaaaaaa-bbbb-cccc-dada-134234ab98",//<-- optional (defaults to the empty space, 0)<br />
time: 2181298451298491284,//milliseconds since 1970<br />
pos:[1,2,3],<br />
vel:[.25,0,0],<br />
orient:[.5,0,0,.5],<br />
rotaxis:[0,0,1],<br />
rotvel:.25,<br />
scale:[1,1,1],//defaults to 1,1,1 if absent<br />
parent:"c46ac00b-58cc-4372-a567-0e02b2c3d479",//<-- optional (defaults to empty, i.e. top level, if absent)<br />
parentbone:"Hand"//name of the bone on the parent object that this is attached to. Assume root transform otherwise<br />
}<br />
<br />
===Moving a graphics object===<br />
//should we define that the graphics system has some sort of interp--otherwise velocity may be useless?<br />
<br />
Should we use "parentbone" or "attachment_point"?<br />
<br />
FIXME: We should default these values to the last position, not to identity. Otherwise we basically have to send everything each time.<br />
<br />
{<br />
msg:"Move",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
time:39852398592385,//milliseconds since 1970<br />
pos:[1,2,3],<br />
vel:[.25,0,0],<br />
orient:[.5,0,0,.5],//defaults to (identity) if absent<br />
rotaxis:[0,0,1],//defaults to 0,0,1 if absent, forcing rotvel to 0 <br />
rotvel:.25,//defaults to 0 if absent<br />
scale:[1,1,1],//defaults to 1,1,1 if absent<br />
interpolate:true,//set to false if the object should snap to new position<br />
parent:"c46ac00b-58cc-4372-a567-0e02b2c3d479",//<-- optional (defaults to previous state if absent; to clear, pass empty string)<br />
attachment_point:"Hand"//name of the bone on the parent object that this is attached to. Defaults to previous state if absent; to clear, pass empty string<br />
}<br />
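One way the graphics thread could honor the FIXME above (absent fields default to the last known state rather than to identity) is a simple merge. applyMove and MOVE_DEFAULTS are illustrative names, not part of the proposed spec:

```javascript
// Sketch: apply a "Move" message so that fields absent from the message keep
// their previous values instead of resetting to identity.
const MOVE_DEFAULTS = { rotvel: 0, interpolate: true };

function applyMove(prev, move) {
  // Fields present in the message win; everything else carries over.
  const next = Object.assign({}, MOVE_DEFAULTS, prev, move);
  // Empty string explicitly clears parent/attachment_point.
  if (move.parent === "") delete next.parent;
  if (move.attachment_point === "") delete next.attachment_point;
  return next;
}
```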
<br />
===Destroying a graphics object===<br />
<br />
{<br />
msg:"Destroy",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479"<br />
}<br />
<br />
<br />
==Managing object appearance properties==<br />
===Adding/changing mesh property for an object===<br />
{<br />
msg:"Mesh",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
type:"collada",//string that specifies file format<br />
mesh:"http://example.com/test.dae",//the mesh should be rescaled to fit inside a unit sphere centered at 0,0,0<br />
}<br />
<br />
===Updating shader property (Vertex and Fragment float4) for an object===<br />
{<br />
msg:"MeshShaderUniform",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
name:["ColorTint","HowManyIterations"],<br />
value:[[.24,.24,.25,1.0],[1,0,0,0]],<br />
type:["float4", "int4"]<br />
}<br />
<br />
===Removing mesh property for an object===<br />
{<br />
msg:"DestroyMesh",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
}<br />
<br />
===Adding/changing light property for an object===<br />
<br />
{<br />
msg:"Light",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
diffuse_color:[.25,.5,1],<br />
specular_color: [.2,1,.5],<br />
power: 1.0,//exponent on the light<br />
ambient_color: [0,0,0],<br />
light_range: 1.0e5,<br />
constant_falloff: 0.5,<br />
linear_falloff: 0.2,<br />
quadratic_falloff: 0.1,<br />
cone_inner_radians: 0,<br />
cone_outer_radians: 0,<br />
cone_falloff: 0.5,<br />
type: "POINT",//options include "SPOTLIGHT" or "DIRECTIONAL"<br />
casts_shadow: true,<br />
shader:"http://www.example.com/pointLight.shader"// light shader for ray tracing (empty by default)<br />
}<br />
<br />
===Removing light property for an object===<br />
{<br />
msg:"DestroyLight",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479"<br />
}<br />
<br />
==Camera Management==<br />
<br />
===Creating camera properties on an object===<br />
{<br />
msg:"Camera",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479"<br />
}<br />
<br />
===Destroying and cleaning up a camera===<br />
{<br />
msg:"DestroyCamera",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479"<br />
}<br />
<br />
===Attach a camera to an object's texture===<br />
{<br />
msg:"AttachCamera",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479", // Camera object's id<br />
texobjid:"9a10e9c1-31fb-43e8-9a20-6545d9a62fdb", // Id of object with a mesh<br />
texname:"example.png"//overwrites this texture on the texobjid object.<br />
}<br />
===Attach a camera to a render target===<br />
{<br />
msg:"AttachCamera",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479", // Camera object's id<br />
target:0//writes to this framebuffer--- 0 for left ,1 right for stereo, etc.<br />
}<br />
===Detach a camera from its render target===<br />
{<br />
msg:"DetachCamera",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479", // Camera object's id<br />
}<br />
<br />
==Skeleton Management==<br />
<br />
===Streaming some joint locations===<br />
{<br />
msg:"AnimateBone",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
animation:"uniqueAnimationIdentifier",//so this movement can be associated with one animation and blended with others<br />
//if not specified this is a hard constraint (i.e. foot is glued to a wall in order to avoid penetrating it)<br />
weight:1.0,//the weight for prospective blending, defaults to 1.0<br />
time:1250120951209510295,//milliseconds since 1970<br />
bone:["ankle","arm"],<br />
pos:[[1,2,3],[2,3,4]],<br />
vel:[[.25,0,0],[0,0,0]],<br />
orient:[[.5,0,0,.5],[1,0,0,0]],<br />
rotaxis:[[0,0,1],[0,1,0]],<br />
rotvel:[.25,0],<br />
interpolate:true//if false then the bone should snap to the location unless smooth is set (in which case it should interpolate as quickly as possible) defaults to true<br />
}<br />
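One plausible reading of the weight field above is a normalized per-bone blend across the active animations. The sketch below blends positions only (orientations would need quaternion interpolation); blendBone is a hypothetical helper, not part of the proposed API:

```javascript
// Sketch: blend one bone's position across active animation channels by
// normalized weight. channels: [{weight, pos:[x,y,z]}, ...]
function blendBone(channels) {
  const total = channels.reduce((s, c) => s + c.weight, 0);
  if (total === 0) return [0, 0, 0];   // no active channel contributes
  return [0, 1, 2].map(i =>
    channels.reduce((s, c) => s + c.pos[i] * (c.weight / total), 0));
}
```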
<br />
==Event handling==<br />
<br />
===Mouse Events===<br />
<br />
Messages <u>from the graphics system</u> for standard browser events:<br />
{<br />
msg:"mousemove",<br />
x:100,//X coordinate of the mouse, relative to canvas<br />
y:102,//Y coordinate of the mouse, relative to canvas<br />
}<br />
<br />
{<br />
msg:"mousedown",<br />
which:0,//right mouse button = 2, left mouse button = 0<br />
x:100,//X coordinate of the mouse, relative to canvas<br />
y:102,//Y coordinate of the mouse, relative to canvas<br />
}<br />
<br />
{<br />
msg:"mouseup",<br />
which:2,//right mouse button = 2, left mouse button = 0<br />
x:100,//X coordinate of the mouse, relative to canvas<br />
y:102,//Y coordinate of the mouse, relative to canvas<br />
}<br />
<br />
{<br />
msg:"mouseover",<br />
x:100,//X coordinate of the mouse, relative to canvas<br />
y:102,//Y coordinate of the mouse, relative to canvas<br />
}<br />
<br />
{<br />
msg:"mouseout",<br />
x:100,//X coordinate of the mouse, relative to canvas<br />
y:102,//Y coordinate of the mouse, relative to canvas<br />
}<br />
<br />
Despite the fact that click can be derived from mousedown and mouseup, we keep it for consistency with the Web.<br />
{<br />
msg:"click",<br />
which:0,//right mouse button = 2, left mouse button = 0<br />
x:100,//X coordinate of the mouse, relative to canvas<br />
y:102,//Y coordinate of the mouse, relative to canvas<br />
}<br />
<br />
As an alternative to mousemove we introduce a pick message that returns additional 3D data:<br />
{<br />
msg:"pick",<br />
x:100,//X coordinate of the mouse, relative to canvas<br />
y:102,//Y coordinate of the mouse, relative to canvas<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",//which object was hit?<br />
pos:[1,2,3],//where on the surface did it hit?<br />
normal:[.5,0,.86],//what the direction of the normal is at that point<br />
}<br />
<br />
To save traffic we allow messages <u>to the graphics system</u> that enable/disable these messages:<br />
{<br />
msg:"enable",<br />
type:"mousemove"//message type<br />
}<br />
<br />
{<br />
msg:"disable",<br />
type:"pick"//message type<br />
}<br />
<br />
More advanced messages, such as pickover, pickout, drag and others, can be derived from the messages described above, and we should keep this list as simple as possible. For example, pickover and pickout can be derived from mouseover and pick messages (when the object ID under the cursor changes, it's a pickover for the new object and a pickout for the old one). Moreover, we can use mouseover and mouseout to enable/disable picking to reduce inter-thread traffic.<br />
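As a sketch of that derivation, a small handler can track the last picked object ID and synthesize pickover/pickout events. makePickTracker and emit are hypothetical names, not part of the message set above:

```javascript
// Sketch: derive pickover/pickout from the pick message stream.
// emit(event) is a caller-supplied callback for the synthesized messages.
function makePickTracker(emit) {
  let lastId = null;
  return function onPick(pick) {       // pick: {msg:"pick", id, x, y, ...}
    if (pick.id !== lastId) {
      if (lastId !== null) emit({ msg: "pickout", id: lastId });
      if (pick.id) emit({ msg: "pickover", id: pick.id });
      lastId = pick.id || null;
    }
  };
}
```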
<br />
===Keyboard===<br />
<br />
These messages are sent <u>from the graphics system</u> in response to the user's keyboard actions:<br />
{<br />
msg: "keydown",//key was pressed, but not released<br />
altKey:true,<br />
metaKey:false,<br />
ctrlKey:true,<br />
repeat:false,<br />
shiftKey:true,<br />
which:25//see keyCode<br />
}<br />
<br />
{<br />
msg: "keyup",//key was released<br />
altKey:true,<br />
metaKey:false,<br />
ctrlKey:true,<br />
repeat:false,<br />
shiftKey:true,<br />
which:25//see keyCode<br />
}<br />
<br />
For consistency with the web we also allow keypress, despite the fact that it can be derived from keydown and keyup:<br />
{<br />
msg: "keypress",<br />
altKey:true,<br />
metaKey:false,<br />
ctrlKey:true,<br />
repeat:false,//we send repeated messages if the user holds a key; every message except the first one will have this property set to true<br />
shiftKey:true,<br />
which:25//see keyCode<br />
}<br />
<br />
===Requesting intersection===<br />
<br />
Request <u>to the graphics system</u>:<br />
{<br />
msg:"raytrace",<br />
id:5,//request ID<br />
pos:[2,3,4],//origin of the ray<br />
dir:[.24,.33,.5],//direction of the ray<br />
multiple:true,//if false, only return first hit, otherwise return all intersections<br />
infinite:false//if false use length of dir to specify ray length<br />
}<br />
<br />
Response <u>from the graphics system</u>:<br />
{<br />
msg:"intersections",<br />
id:5,//request ID<br />
pos:[[2,3,4],[2.23,3.32,4.49]],//positions of the points of intersections<br />
normals:[[0,1,0],[.5,0,.86]],//normals of the surface at intersections points<br />
ids:["f47ac10b-58cc-4372-a567-0e02b2c3d479","a33ff133-58dd-2272-dd6a-12aadc31d173"],//object IDs for each intersected surface (named "ids" to avoid clashing with the request ID above)<br />
}<br />
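Since requests and responses share a request ID, a caller-side helper can correlate them. RayClient below is an illustrative sketch, not part of the proposed API; sendToGraphics stands in for whatever cross-thread send mechanism is used:

```javascript
// Sketch: correlate raytrace requests with "intersections" responses by
// request ID.
function RayClient(sendToGraphics) {
  let nextId = 1;
  const pending = {};               // request id -> callback
  this.raytrace = function (pos, dir, cb) {
    const id = nextId++;
    pending[id] = cb;
    sendToGraphics({ msg: "raytrace", id: id, pos: pos, dir: dir,
                     multiple: true, infinite: false });
    return id;
  };
  this.onMessage = function (m) {   // feed responses from the graphics thread
    if (m.msg === "intersections" && pending[m.id]) {
      pending[m.id](m);
      delete pending[m.id];         // one response per request
    }
  };
}
```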
<br />
=Experimental/Brainstorming ideas for the API=<br />
I decided to reserve a section of the wiki for bleeding-edge ideas of cool features that would be nice to have. I could have put this on the "talk" page, but I think it makes more sense here so that it gets wider exposure. These are meant to be things that would help in drawing real scenes and building real VW systems, but for which we haven't figured out a good API yet.<br />
<br />
==Attaching UI elements to graphics objects==<br />
The UI will naturally need to be in HTML since that's the best established cross platform, sandboxed UI system.<br />
<br />
The user may specify a 3D location, orientation and scale for a UI dialog. The graphics system should do its best to scale and position the UI in the appropriate place, but the UI may be restricted to always face the camera and always be horizontal relative to the bottom of the screen on many systems. The UI should not be displayed if it is completely invisible from the camera angle or smaller than 10 pixels.<br />
===Creating/Updating UI Element===<br />
{<br />
msg:"IFrame",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
uri: "http://example.com"<br />
}<br />
<br />
===Destroying UI Element===<br />
{<br />
msg:"DestroyIFrame",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
}<br />
<br />
<br />
It seems like there should be a way aside from "embedded iframes" to get art defined in the DOM into the scene graph--perhaps the canvas tag is the way to go here? But maybe that's too WebGL-specific and won't work for an Ogre port of this.<br />
<br />
==Attaching 3d Text to an Objects==<br />
I'm just brainstorming here: it seems like WebGL has facilities to do this efficiently, but I don't have a good use case except building a rendering system inside a canvas tag or something?<br />
<br />
Perhaps the canvas tag is the way to go <br />
<br />
{<br />
msg:"Text",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
text:"This is a test of the emergency broadcast system",<br />
font:"size=+1"<br />
}<br />
<br />
{<br />
msg:"DestroyText",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
}<br />
<br />
==Particle System==<br />
===Adding a particle system to an object===<br />
This mimics the Ogre interface, and we introduce a number of billboard types:<br />
;point<br />
:The default arrangement; this approximates spherical particles and the billboards always fully face the camera.<br />
;oriented_common<br />
:Particles are oriented around a common, typically fixed direction vector (see common_direction), which acts as their local Y axis. The billboard rotates only around this axis, giving the particle some sense of direction. Good for rainstorms, starfields etc. where the particles will be traveling in one direction - this is slightly faster than oriented_self (see below).<br />
;oriented_self<br />
:Particles are oriented around their own direction vector, which acts as their local Y axis. As the particle changes direction, the billboard reorients itself to face this way. Good for laser fire, fireworks and other 'streaky' particles that should look like they are traveling in their own direction.<br />
;perpendicular_common<br />
:Particles are perpendicular to a common, typically fixed direction vector (see common_direction), which acts as their local Z axis, with their local Y axis coplanar with the common direction and the common up vector (see common_up_vector). The billboard never rotates to face the camera; you might use a double-sided material to ensure particles are never culled by back-facing. Good for aureolas, rings etc. where the particles will be perpendicular to the ground - this is slightly faster than perpendicular_self (see below).<br />
;perpendicular_self<br />
:Particles are perpendicular to their own direction vector, which acts as their local Z axis, with their local Y axis coplanar with their own direction vector and the common up vector (see common_up_vector). The billboard never rotates to face the camera; you might use a double-sided material.<br />
For further documentation about the properties see<br />
http://www.ogre3d.org/docs/manual/manual_35.html#SEC191<br />
http://www.ogre3d.org/docs/manual/manual_36.html#SEC208<br />
<br />
<br />
{ <br />
msg:"ParticleSystem",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
mesh:"http://example.com/billboard.dae",//the mesh should be rescaled to be a 1x1 mesh with <br />
particle_size:[20,20],<br />
cull_each:false,<br />
quota:10000,<br />
billboard:"oriented_self",<br />
sorted:false,//defaults to false--whether the particles should be sorted<br />
local:false,//defaults to false--if true rotation of the node after the emission of the particle will rotate it<br />
direction: [0,0,1],//the common direction for oriented_common or perpendicular_common<br />
up: [0,0,1],//only required if billboard_type is set to perpendicular_self or perpendicular_common; this vector is the common up vector used to orient all particles in the system<br />
accurate_facing:false,//if the facing is set to the camera facing or calculated per billboard<br />
iteration_interval:.125,//how often the particles are updated--if set to 0, defaults to framerate<br />
invisibility_timeout:10//how many seconds of being outside the frustum before the system stops updating<br />
}<br />
<br />
{<br />
msg:"DestroyParticleSystem",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479"<br />
}<br />
<br />
<br />
Once a system is created, particles need to be emitted from it. There should be a global map of default emitters named ParticleEmitters consisting of at least<br />
"Point","Box","Cylinder","Ellipsoid","Shell","Ring", and the extra attributes are specified in http://www.ogre3d.org/docs/manual/manual_38.html<br />
{<br />
msg:"ParticleEmitter",//add or update a particle emitter<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
name:"Flare",<br />
type:"Ring",<br />
angle:15,<br />
emission_rate:75,<br />
time_to_live:[2.5,3],//range between 2.5 and 3<br />
direction:[0,1,0],//3d vector<br />
speed:[250,300],//range between 250 and 300<br />
colour_range:[[1,0,0],[0,0,1]],//random color<br />
position:[0,0,0],<br />
repeat_delay:[2.5,5]<br />
}<br />
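As a sketch of how a renderer might consume the range-valued fields above (time_to_live, speed, repeat_delay), a uniform sample could be drawn per particle. The helper names below are illustrative, not part of the API:

```javascript
// Sample a uniform value from a [min, max] range; plain scalars
// (e.g. angle:15) pass through unchanged. Hypothetical helpers --
// the wire format above only defines the [min, max] pairs.
function sampleRange(range, rng) {
  rng = rng || Math.random;
  if (!Array.isArray(range)) return range;            // plain scalar
  return range[0] + (range[1] - range[0]) * rng();    // uniform in [min, max)
}

// Per-particle parameters drawn from an emitter message.
function spawnParams(emitter, rng) {
  return {
    timeToLive: sampleRange(emitter.time_to_live, rng),
    speed: sampleRange(emitter.speed, rng)
  };
}
```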
<br />
{<br />
msg:"RemoveParticleEmitter",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
name:"Flare"<br />
}<br />
<br />
There may be forces applied to the emitters,<br />
and there must be a global map of affectors called ParticleAffector from which the relevant affector is selected, consisting of at least<br />
<br />
LinearForce, ColourFader, Scaler, Rotator, ColourInterpolator, ColourImage, DeflectorPlane and DirectionRandomiser.<br />
The detailed definitions are contained at http://www.ogre3d.org/docs/manual/manual_40.html#SEC234<br />
<br />
<br />
<br />
<br />
{<br />
msg:"ParticleAffector",//add or update a particle affector<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
name:"TheForce",<br />
type:"LinearForce",<br />
force_vector:[0,-100,0],<br />
force_application:"add"<br />
}<br />
<br />
{<br />
msg:"RemoveParticleAffector",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
name:"TheForce"<br />
}<br />
<br />
==Specifying a terrain for the world==<br />
The terrain would need to be chunked and in some widely readable format---it would be nice to be able to tap into Google Earth for terrain. Ideas for how to do this are still very early.<br />
<br />
=Deprecated API Ideas=<br />
Here we put ideas we had but decided to discard, so that they don't come up again as new ideas; they may be discussed here and evaluated for re-addition if someone feels strongly they should be included.<br />
<br />
<br />
==Skeleton file formats==<br />
<br />
The reason these were removed is that they are too brittle (it's hard to weight a wave and a walk animation and have the steps not be half as wide) and it's difficult to keep the skeletons out of trouble (i.e. feet through the ground),<br />
so we think that the physics system in general should send the bone positions and timestamps, since it's the arbiter of what intersects what--and it can always read the skeleton file format.<br />
<br />
===Animating a skeleton based on a time based animation===<br />
{<br />
msg:"Ani",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
time:489192048120984102,//milliseconds since 1970 that the animation should be started from (skip frames if now is later)<br />
animation:"http://example.com/animation.dae",<br />
loop:false,<br />
weight:1.0,//how strong this animation should be compared with other animations that use the same bones<br />
fadein:2.3//how many seconds to fade in<br />
}<br />
<br />
Note that the animation.dae should have annotations for loop-in point and loop-out point within the .dae so that loop can intelligently function<br />
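The "skip frames if now is later" behaviour amounts to computing how far into the animation playback should begin. A minimal sketch (startOffsetSeconds is a hypothetical helper, not part of the message format):

```javascript
// Given an "Ani" message's start time (ms since 1970) and the current
// clock, return how many seconds into the animation playback should
// begin. A start time still in the future yields 0.
function startOffsetSeconds(msgTimeMs, nowMs) {
  return Math.max(0, (nowMs - msgTimeMs) / 1000);
}
```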
===Stopping a skeleton based on a time based animation===<br />
{<br />
msg:"AniStop",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
animation:"http://example.com/animation.dae",<br />
fadeout:1.0//how many seconds to fade out<br />
}<br />
<br />
<br />
==Should an object just be a sprite==<br />
We figured that a collada square file may be a more compact representation for a sprite, and it can contain the appropriate shader, materials, etc.<br />
===Making an object a point sprite===<br />
This mimics the ogre interface and we introduce a number of billboard types<br />
point<br />
The default arrangement, this approximates spherical particles and the billboards always fully face the camera. <br />
oriented_self<br />
Particles are oriented around their own direction vector, which acts as their local Y axis. As the particle changes direction, so the billboard reorients itself to face this way. Good for laser fire, fireworks and other 'streaky' particles that should look like they are traveling in their own direction. <br />
perpendicular_self<br />
Particles are perpendicular to their own direction vector, which acts as their local Z axis, with their local Y axis coplanar with their own direction vector and the common up vector (see common_up_vector). The billboard never rotates to face the camera; you might use a double-sided material to ensure the particles are never culled by back-facing.<br />
For further documentation about the properties see<br />
http://www.ogre3d.org/docs/manual/manual_35.html#SEC191<br />
http://www.ogre3d.org/docs/manual/manual_36.html#SEC208<br />
<br />
{<br />
msg:"Sprite",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
sprite:"http://example.com/test.jpg",//the sprite should be rescaled to fit inside a unit sphere centered at 0,0,0<br />
billboard:"perpendicular_self",<br />
sorted:false,//defaults to false--whether the particles should be sorted<br />
up:[0,0,1]//Only required if billboard_type is set to perpendicular_self; this vector is the common up vector used to orient all particles in the system.<br />
}<br />
<br />
===Removing point sprite property from object===<br />
{<br />
msg:"DestroySprite",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
}</div>Rrykhttps://www.sirikata.com/wiki/index.php?title=BuildTheCode&diff=697BuildTheCode2010-11-12T08:14:42Z<p>Rryk: /* Windows */</p>
<hr />
<div>= Dependencies =<br />
<br />
Sirikata depends on quite a few external libraries. [[Dependencies]] gives a list of required and optional libraries and a brief description of how they are used.<br />
<br />
When building Sirikata you have two options: handle installing dependencies yourself or use our install scripts or precompiled binaries (depending on platform and library) to help you build and install the required libraries. If you want to perform the installation or need to write new scripts for a new platform, see [[Dependencies]] for instructions on how to do so.<br />
<br />
These instructions explain how to use our install script or precompiled binaries.<br />
<br />
{{note}} Some dependency files are quite large, especially the precompiled binaries. When you run the install commands, it may appear the script has hung, but it's likely just checking out the packages.<br />
<br />
{{note}} Throughout we'll be using the standard set of dependencies, which includes support for graphics, embedded browsers, .NET scripting via Mono, Python scripting via IronPython, and physics simulation via Bullet. If you don't need all these features, for example because you are only running the space server, you can use a different dependency target. Where <tt>depends</tt> is used, you can substitute one of the following targets to get a different set of dependencies:<br />
* <tt>minimal-depends</tt> - the minimal dependencies required to get the system building<br />
* <tt>minimal-graphics-depends</tt> - the minimal dependencies required to get 3D graphics working, i.e. to get a client running. Note that not all features of the client will be enabled by this<br />
* <tt>full-depends</tt> - all dependencies, enabling all features of the system, and enables use of root to install system packages<br />
<br />
== CMake ==<br />
<br />
We use CMake on all platforms to check for dependencies and generate a build script. The Linux dependency script will install CMake from your system's package manager if you allow it to use root. Otherwise, and on all Mac and Windows platforms, CMake is not installed automatically. You should use the installer from [http://www.cmake.org the CMake website] or install it using your system's package manager. CMake 2.4 or higher is required; CMake 2.6 or higher is preferred.<br />
<br />
== Subversion ==<br />
<br />
Because of their size, dependencies are stored in Subversion repositories. If you want to use our automated dependency scripts, you'll need Subversion installed. It is installed on Mac by default, Linux users can install it from their distribution's packages, and Windows users can either install it via Cygwin or use [http://tortoisesvn.tigris.org/ TortoiseSVN].<br />
<br />
== Simple Method: Makefile ==<br />
On all platforms, if you have the right tools (make and subversion, on Windows provided by Cygwin) a Makefile is provided in the root directory which checks out and either builds or extracts dependencies to <tt>sirikata/dependencies/</tt>:<br />
<br />
cd sirikata/<br />
make depends<br />
<br />
== Windows Notes ==<br />
The simple method requires Cygwin. All dependencies are installed locally in the Sirikata checkout. No root access is needed.<br />
<br />
{{note}} The Windows dependencies support Visual Studio 2008 and later, including Express editions. Visual Studio 2005 and earlier are no longer supported.<br />
<br />
=== Without Cygwin (manual) ===<br />
If you do not have Cygwin and do not want to install it, you can perform the steps manually. Using a subversion client, checkout<br />
http://sirikatawin32.googlecode.com/svn/trunk/ <br />
to <tt>sirikata/dependencies/</tt>.<br />
<br />
Go into the dependencies folder and unzip all of the packages directly into the dependencies directory. Make sure to select "Extract Here"; if that option is not available, remove the zip file's name from the end of the suggested extraction path so that it ends with "dependencies". The extracted directories should not have the same name as the zip files.<br />
<br />
== Mac ==<br />
<br />
If you want .NET scripting support (the default scripting environment), you must install Mono manually. Install Mono 2.4 or later from [http://www.mono-project.com/ the Mono project site] and run the installer .pkg. The installer will place a Mono.framework folder in /Library/Frameworks.<br />
<br />
== Linux ==<br />
The install script expects an Ubuntu system, 8.04 or greater.<br />
<br />
If you want to install the system package dependencies directly, issue the following command:<br />
<br />
sudo apt-get install \<br />
git-core cmake sed unzip zip automake1.9 nvidia-cg-toolkit jam g++ \<br />
libzzip-dev libxt-dev libxaw7-dev libxxf86vm-dev libxrandr-dev libfreetype6-dev \<br />
libxext-dev autoconf libtool libpcre3-dev flex bison patch libbz2-dev gawk \<br />
libglu1-mesa-dev tofrodos libglut3-dev freeglut3-dev scons libexpat1-dev \<br />
libgtk2.0-dev libnss3-dev libgconf2-dev gperf libasound2-dev subversion \<br />
ruby libgsl0-dev libssl-dev<br />
<br />
These packages should cover everything you need for the entire system, including the graphical client and all scripting plugins.<br />
<br />
= Building Sirikata =<br />
<br />
We use CMake to generate our build scripts. Make is used on Mac and Linux and Visual Studio is used on Windows to perform the actual build. All three builds follow the same basic steps:<br />
* Run cmake, possibly modifying the configuration.<br />
* Run your build tool.<br />
<br />
== Windows ==<br />
<br />
Start up CMake and point both paths to sirikata/build/cmake. Hit configure twice. If you installed any dependencies in non-standard locations, point CMake to them now. To generate the build files, hit Generate.<br />
On Windows Vista or later you may want to change CMAKE_INSTALL_PREFIX to the directory where you want your compiled binaries installed and hit Configure again: by default CMake generates a Visual Studio project that tries to install the binaries into %PROGRAMFILES%, which requires administrative privileges, so the build will fail unless you run Visual Studio with administrative privileges or have disabled UAC.<br />
<br />
Now browse to sirikata/build/cmake. Open Sirikata.sln and run Build All.<br />
<br />
This should result in all libraries, plugins, and binaries being placed in sirikata/build/cmake/debug or sirikata/build/cmake/release, depending on which configuration you built.<br />
<br />
=== Notes ===<br />
* There aren't standard locations to search for dependencies on Windows. Obviously we've setup the build system to work cleanly with the dependency install script. The easiest way to get a manual installation of dependencies on Windows to work is to use the same layout as the install script, where all dependencies are located in sirikata/dependencies. If you don't do this, you will almost certainly need to manually specify the locations of some libraries in CMake.<br />
* The build is known to work for VS2008, for both regular and Express versions.<br />
* If you get error 0xc0000022, check the permissions of the sirikata top-level directory. If you used Cygwin's version of git, it will remove execute permissions for security purposes. You may enable execute by doing "chmod -R +x sirikata" in Cygwin, or by granting yourself Full Control in Right Click->Properties->Security (make sure to check the "Replace All Entries in Child Objects" checkbox).<br />
<br />
==== DLL Path under windows ====<br />
In order to get Sirikata to find the required DLLs, you must copy all DLL files from these directories into the "build/cmake" directory:<br />
* dependencies\boost_1_37_0\lib<br />
* dependencies\installed-curl<br />
* dependencies\ogre-1.6.1\bin<br />
* dependencies\SDL-1.3.0\bin<br />
* dependencies\protobufs\bin<br />
<br />
Note that Sirikata currently does not work in Windows 2000, due to the lack of a few raw input functions (RegisterRawInputDevice) that SDL 1.3 uses. This will hopefully be fixed in SDL at some point to make it use the old DirectX raw input system.<br />
<br />
== Mac and Linux ==<br />
<br />
For convenience we provide a top level makefile which performs the standard build operations. If you want a default build and have used the install script for dependencies, do:<br />
<br />
cd sirikata/<br />
make<br />
<br />
If you'd like to run the build manually, do the following (essentially what is in the makefile):<br />
<br />
cd sirikata/build/cmake<br />
cmake . [-DCMAKE_BUILD_TYPE=Debug|Release]<br />
make<br />
<br />
To interactively adjust settings, for instance to point CMake to a different version of libraries:<br />
<br />
cd sirikata/build/cmake<br />
ccmake .<br />
<br />
When the build completes you should have the libraries, plugins, and binaries in sirikata/build/cmake.<br />
<br />
=== Snow Leopard ===<br />
<br />
{{note}} Snow Leopard support is a work in progress. These instructions may or may not work. If you find these do not work, please [[Communication|let us know]].<br />
<br />
With the release of Apple's Snow Leopard (10.6) operating system comes the opportunity to have software compiled for two different architectures running on the same machine: 32-bit i386 and 64-bit x86_64 (k8). Though Intel hardware has long been capable of running in a 64-bit address space, prior to 10.6 the operating system didn't support it. Unfortunately, the default architecture is now 64-bit, but not all libraries and dependencies are 64-bit savvy, so it is best to compile for the 32-bit architecture.<br />
<br />
===== Upgrade Xcode and Developer Tools =====<br />
<br />
First, you need to upgrade the Xcode tools to 3.2.x. On the Snow Leopard DVD, under “Optional Installs”, install “Xcode.mpkg” to get 3.2. Use all default options. Alternatively, you may want to log in and download the latest Xcode at<br />
<br />
http://developer.apple.com/mac/<br />
<br />
This is highly recommended, because the initial Xcode 3.2 seems to get into a spinning beachball state a bit too frequently. Xcode 3.2.1 seems to be a better multitasking citizen.<br />
<br />
===== Upgrade MacPorts =====<br />
<br />
Then, you need to upgrade MacPorts. Get the Snow Leopard disk image at<br />
<br />
http://www.macports.org/install.php<br />
<br />
After successful installation, make sure to run<br />
<br />
sudo port selfupdate<br />
<br />
Then, you need to remove all of your ports, and reinstall new ones. See the documentation at:<br />
<br />
http://trac.macports.org/wiki/Migration<br />
<br />
By default, the libraries and frameworks are compiled for the 64-bit architecture, so you need to use the ''+universal'' flag to produce both 32- and 64-bit variants.<br />
<br />
sudo port install cmake +universal zlib +universal bzip2 +universal boost +universal sqlite3 +universal<br />
<br />
===== Regenerate Makefiles =====<br />
<br />
cd build/cmake<br />
rm CMakeCache.txt<br />
cmake . -DCMAKE_CXX_FLAGS="-arch i386"<br />
<br />
Now, you should be able to build.<br />
<br />
=== Mac XCode Project ===<br />
<br />
Many mac developers prefer to use the XCode programming environment.<br />
To do this, cmake will need to be configured to use XCode<br />
<br />
Steps are:<br />
<br />
1. test -d build/cmake/Debug || mkdir build/cmake/Debug<br />
1. The directory for XCode build products.<br />
2. python csv_converter.py build/cmake/Debug/scene.csv scene.db<br />
1. Generate a scene for use by the XCode project.<br />
3. cd build/cmake<br />
4. ln -s ../../dependencies/Frameworks Frameworks<br />
1. To fix brittle paths in Xcode project.<br />
5. cmake -G Xcode .<br />
6. cd Debug<br />
7. ln -s ../Sirikata.Protocol.dll Sirikata.Protocol.dll<br />
8. ln -s ../Sirikata.Runtime.dll Sirikata.Runtime.dll<br />
9. cd ..<br />
10. Launch Xcode<br />
11. Open Sirikata.xcodeproj<br />
12. Edit project settings<br />
1. General tab<br />
2. Choose project root<br />
3. Select build/cmake directory. Click OK. Value should just be <Project File Directory>.<br />
1. This fixes a problem that prevents debugger from working.<br />
13. Project menu<br />
1. Select active target ALL_BUILD<br />
2. Select active project executable cppoh<br />
3. Select active build configuration Debug<br />
14. Close XCode to save the project<br />
15. Reopen Sirikata.xcodeproj<br />
16. Build it<br />
17. Debugger<br />
1. Confirm that you can set a break point.<br />
2. Run debugger<br />
3. Confirm that it stops at a break point.<br />
4. Confirm that it displays source code.<br />
5. (trouble shoot back to step 10)<br />
<br />
= Running Sirikata =<br />
Assuming the previous steps have completed successfully, you should have a few binaries built and you are ready to [[Running Sirikata|run Sirikata]].</div>Rrykhttps://www.sirikata.com/wiki/index.php?title=JavascriptGraphicsAPI&diff=664JavascriptGraphicsAPI2010-08-05T09:42:56Z<p>Rryk: /* Requesting intersection */</p>
<hr />
<div>=Rationale for a common API to 3d graphics systems=<br />
<br />
Objects are sent across the thread barrier to alter the current scene graph being displayed--here's why:<br />
<br />
In modern engines, graphics framerates should not be tied to physics framerates, and networking events and decoding of those events should happen at the correct pace to keep up with the networking adapter. Graphics, however, is tied to the DOM and therefore must be on the main thread. This forces both networking and physics to be on webworker threads if there is to be any threading.<br />
<br />
These threads need to advertise state changes to the main graphics thread so that the scene graph may be altered at the graphics rate. This requires that the individual physics and networking threads send timestamped events to the graphics system which drive changes to it. <br />
<br />
Since graphics can run at different rates and the updates from the network may be irregular, the graphics (main) thread needs to have a smooth interpolation scheme interpolating the current position with the timestamped updates sent by the network/physics thread(s). The interpolation scheme should use cubic interpolation using the most current update's position and orientation along with the displayed position and location of the object when that update was received to provide a smooth scheme.<br />
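As a sketch of the cubic scheme described above, a standard Hermite interpolant can blend the displayed state when an update arrived (p0, v0) with the update's state (p1, v1). Here t is normalized to [0, 1] over the interpolation window, and the tangents are assumed to be pre-scaled by the window duration; the names are illustrative, not part of the API:

```javascript
// Cubic Hermite interpolation between two timestamped states.
// p0/p1 are 3-element position arrays; v0/v1 are tangents already
// scaled by the interpolation window duration; t is in [0, 1].
function hermite(p0, v0, p1, v1, t) {
  const t2 = t * t, t3 = t2 * t;
  const h00 = 2 * t3 - 3 * t2 + 1;   // weight for start position
  const h10 = t3 - 2 * t2 + t;       // weight for start tangent
  const h01 = -2 * t3 + 3 * t2;      // weight for end position
  const h11 = t3 - t2;               // weight for end tangent
  return p0.map((_, i) =>
    h00 * p0[i] + h10 * v0[i] + h01 * p1[i] + h11 * v1[i]);
}
```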
<br />
Below are some example objects that may be sent cross-thread. The objects are listed in JSON format so the type information should be clear from the example.<br />
<br />
=API To Graphics System=<br />
Graphics should provide a constructor that takes a callback and a parent DOM element and returns an object with a send(obj) method, which accepts serializable objects from other threads that modify graphics state, and an optional destroy() method, which cleans up graphics state in the DOM.<br />
<br />
So a sequence of code to construct a graphics system, make an object, and destroy it could look like:<br />
gfx= new GLGERenderer(callbackFunction,parentElement)<br />
gfx.send({ msg:"Create", id:"f47ac10b-58cc-4372-a567-0e02b2c3d479", time: 2181298451298491284, pos:[1,2,3], orient:[.5,0,0,.5]})<br />
gfx.destroy();<br />
<br />
=Cross thread communication from Physics and Networking to graphics=<br />
==Object Management==<br />
IDs can be anything from human-readable strings to UUIDs to integers. They must each be unique, and are chosen by the user of the API.<br />
===Creating a new graphics object===<br />
<br />
{<br />
msg:"Create",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
spaceid:"aaaaaaaa-bbbb-cccc-dada-134234ab98",//<-- optional (defaults to the empty space, 0)<br />
time:2181298451298491284,//milliseconds since 1970<br />
pos:[1,2,3],<br />
vel:[.25,0,0],<br />
orient:[.5,0,0,.5],<br />
rotaxis:[0,0,1],<br />
rotvel:.25,<br />
scale:[1,1,1],//defaults to 1,1,1 if absent<br />
parent:"c46ac00b-58cc-4372-a567-0e02b2c3d479",//<-- optional (defaults to empty--toplevel--if absent)<br />
parentbone:"Hand"//name of the bone on the parent object that this is attached to. Assume root transform otherwise.<br />
}<br />
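To illustrate how the optional parent field composes, a receiver might resolve an object's world position by walking the parent chain. This sketch deliberately ignores orientation, scale, and bone attachment; objects is a hypothetical id-to-state map, not part of the API:

```javascript
// Resolve an object's world position by walking the "parent" chain.
// Simplified: translations only (no orientation, scale, or parentbone).
function worldPos(objects, id) {
  const obj = objects[id];
  const pos = obj.pos.slice();
  if (obj.parent) {
    const parentPos = worldPos(objects, obj.parent);
    for (let i = 0; i < 3; i++) pos[i] += parentPos[i];
  }
  return pos;
}
```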
<br />
===Moving a graphics object===<br />
//should we define that the graphics system has some sort of interp--otherwise velocity may be useless?<br />
<br />
Should we use "parentbone" or "attachment_point"?<br />
<br />
FIXME: We should default these values to the last position, not to identity. Otherwise we basically have to send everything each time.<br />
<br />
{<br />
msg:"Move",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
time:39852398592385,//milliseconds since 1970<br />
pos:[1,2,3],<br />
vel:[.25,0,0],<br />
orient:[.5,0,0,.5],//defaults to (identity) if absent<br />
rotaxis:[0,0,1],//defaults to 0,0,1 if absent, forcing rotvel to 0 <br />
rotvel:.25,//defaults to 0 if absent<br />
scale:[1,1,1],//defaults to 1,1,1 if absent<br />
interpolate:true,//set to false if the object should snap to new position<br />
parent:"c46ac00b-58cc-4372-a567-0e02b2c3d479"//<-- optional (defaults to previous state if absent, to clear pass empty string)<br />
attachment_point:"Hand"//name of the bone on the parent object that this is attached to. Defaults to previous state if absent, to clear pass empty string<br />
}<br />
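One way a receiver could implement the defaulting suggested in the FIXME above (absent fields keep the previous state; an empty string clears parent/attachment_point) is sketched below. applyMove is a hypothetical helper, not part of the spec:

```javascript
// Merge a "Move" message into an object's last known state, so absent
// fields keep their previous values and "" explicitly clears
// parent/attachment_point.
function applyMove(prev, msg) {
  const next = Object.assign({}, prev);
  for (const key of ["time", "pos", "vel", "orient", "rotaxis",
                     "rotvel", "scale", "interpolate"]) {
    if (msg[key] !== undefined) next[key] = msg[key];
  }
  for (const key of ["parent", "attachment_point"]) {
    if (msg[key] === "") delete next[key];          // explicit clear
    else if (msg[key] !== undefined) next[key] = msg[key];
  }
  return next;
}
```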
<br />
===Destroying a graphics object===<br />
<br />
{<br />
msg:"Destroy",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479"<br />
}<br />
<br />
<br />
==Managing object appearance properties==<br />
===Adding/changing mesh property for an object===<br />
{<br />
msg:"Mesh",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
type:"collada",//string that specifies file format<br />
mesh:"http://example.com/test.dae",//the mesh should be rescaled to fit inside a unit sphere centered at 0,0,0<br />
}<br />
<br />
===Updating shader property (Vertex and Fragment float4) for an object===<br />
{<br />
msg:"MeshShaderUniform",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
name:["ColorTint","HowManyIterations"],<br />
value:[[.24,.24,.25,1.0],[1,0,0,0]],<br />
type:["float4","int4"]<br />
}<br />
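The name/value/type fields are parallel arrays, so a receiver would presumably pair them up before applying each uniform. A minimal sketch (zipUniforms is a hypothetical helper):

```javascript
// Zip the parallel name/value/type arrays of a "MeshShaderUniform"
// message into one record per uniform.
function zipUniforms(msg) {
  return msg.name.map((n, i) => ({
    name: n,
    value: msg.value[i],
    type: msg.type[i]
  }));
}
```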
<br />
===Removing mesh property for an object===<br />
{<br />
msg:"DestroyMesh",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
}<br />
<br />
===Adding/changing light property for an object===<br />
<br />
{<br />
msg:"Light",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
diffuse_color:[.25,.5,1],<br />
specular_color:[.2,1,.5],<br />
power:1.0,//exponent on the light<br />
ambient_color:[0,0,0],<br />
light_range:1.0e5,<br />
constant_falloff:0.5,<br />
linear_falloff:0.2,<br />
quadratic_falloff:0.1,<br />
cone_inner_radians:0,<br />
cone_outer_radians:0,<br />
cone_falloff:0.5,<br />
type:"POINT",//options include "SPOTLIGHT" or "DIRECTIONAL"<br />
casts_shadow:true,<br />
shader:"http://www.example.com/pointLight.shader"//light shader for ray tracing (empty by default)<br />
}<br />
<br />
===Removing light property for an object===<br />
{<br />
msg:"DestroyLight",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479"<br />
}<br />
<br />
==Camera Management==<br />
<br />
===Creating camera properties on an object===<br />
{<br />
msg:"Camera",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479"<br />
}<br />
<br />
===Destroying and cleaning up a camera===<br />
{<br />
msg:"DestroyCamera",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479"<br />
}<br />
<br />
===Attach a camera to an object's texture===<br />
{<br />
msg:"AttachCamera",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479", // Camera object's id<br />
texobjid:"9a10e9c1-31fb-43e8-9a20-6545d9a62fdb", // Id of object with a mesh<br />
texname:"example.png"//overwrites this texture on the texobjid object.<br />
}<br />
===Attach a camera to a render target===<br />
{<br />
msg:"AttachCamera",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479", // Camera object's id<br />
target:0//writes to this framebuffer--- 0 for left ,1 right for stereo, etc.<br />
}<br />
===Detach a camera from its render target===<br />
{<br />
msg:"DetachCamera",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479", // Camera object's id<br />
}<br />
<br />
==Skeleton Management==<br />
<br />
===Streaming some joint locations===<br />
{<br />
msg:"AnimateBone",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
animation:"uniqueAnimationIdentifier",//so this movement can be associated with one animation and blended with others<br />
//if not specified this is a hard constraint (i.e. foot is glued to a wall in order to avoid penetrating it)<br />
weight:1.0,//the weight for prospective blending, defaults to 1.0<br />
time:1250120951209510295,//milliseconds since 1970<br />
bone:["ankle","arm"],<br />
pos:[[1,2,3],[2,3,4]],<br />
vel:[[.25,0,0],[0,0,0]],<br />
orient:[[.5,0,0,.5],[1,0,0,0]],<br />
rotaxis:[[0,0,1],[0,1,0]],<br />
rotvel:[.25,0],<br />
interpolate:true//if false then the bone should snap to the location unless smooth is set (in which case it should interpolate as quickly as possible) defaults to true<br />
}<br />
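Blending concurrent animations on a single bone could normalize the weights of the updates touching it. This is a sketch for positions only, with an illustrative entries shape; it is not part of the wire format:

```javascript
// Blend per-animation positions for a single bone by normalized weight.
// entries: [{ pos: [x, y, z], weight: w }, ...] -- one entry per
// animation currently driving this bone.
function blendBone(entries) {
  const total = entries.reduce((sum, e) => sum + e.weight, 0);
  const out = [0, 0, 0];
  for (const e of entries) {
    const w = e.weight / total;          // normalized blend weight
    for (let i = 0; i < 3; i++) out[i] += w * e.pos[i];
  }
  return out;
}
```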
<br />
==Event handling==<br />
<br />
===Mouse Events===<br />
<br />
Messages <u>from the graphics system</u> for standard browser events:<br />
{<br />
msg:"mousemove",<br />
x:100,//X coordinate of the mouse, relative to canvas<br />
y:102,//Y coordinate of the mouse, relative to canvas<br />
}<br />
<br />
{<br />
msg:"mousedown",<br />
which:0,//right mouse button = 2, left mouse button = 0<br />
x:100,//X coordinate of the mouse, relative to canvas<br />
y:102,//Y coordinate of the mouse, relative to canvas<br />
}<br />
<br />
{<br />
msg:"mouseup",<br />
which:2,//right mouse button = 2, left mouse button = 0<br />
x:100,//X coordinate of the mouse, relative to canvas<br />
y:102,//Y coordinate of the mouse, relative to canvas<br />
}<br />
<br />
{<br />
msg:"mouseover",<br />
x:100,//X coordinate of the mouse, relative to canvas<br />
y:102,//Y coordinate of the mouse, relative to canvas<br />
}<br />
<br />
{<br />
msg:"mouseout",<br />
x:100,//X coordinate of the mouse, relative to canvas<br />
y:102,//Y coordinate of the mouse, relative to canvas<br />
}<br />
<br />
Despite the fact that click can be derived from mousedown and mouseup, we keep it for consistency with the Web.<br />
{<br />
msg:"click",<br />
which:0,//right mouse button = 2, left mouse button = 0<br />
x:100,//X coordinate of the mouse, relative to canvas<br />
y:102,//Y coordinate of the mouse, relative to canvas<br />
}<br />
<br />
As an alternative to mousemove we introduce a pick message that will return additional 3D data:<br />
{<br />
msg:"pick",<br />
x:100,//X coordinate of the mouse, relative to canvas<br />
y:102,//Y coordinate of the mouse, relative to canvas<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",//which object was hit?<br />
pos:[1,2,3],//where on the surface did it hit?<br />
normal:[.5,0,.86],//what the direction of the normal is at that point<br />
}<br />
<br />
To save traffic we allow messages <u>to the graphics system</u> that enable/disable these messages:<br />
{<br />
msg:"enable",<br />
type:"mouseover"//message type<br />
}<br />
<br />
{<br />
msg:"disable",<br />
type:"pick"//message type<br />
}<br />
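A graphics system honoring these control messages might gate its outbound events as follows. makeEventGate and the default-enabled set are assumptions for illustration, not part of the spec:

```javascript
// Track which outbound event types are enabled, honoring the
// "enable"/"disable" control messages, and drop anything disabled.
function makeEventGate(send, initiallyEnabled) {
  const enabled = new Set(initiallyEnabled ||
    ["mousemove", "mousedown", "mouseup", "click"]);
  return {
    control(msg) {                 // handle enable/disable requests
      if (msg.msg === "enable") enabled.add(msg.type);
      else if (msg.msg === "disable") enabled.delete(msg.type);
    },
    emit(event) {                  // forward only enabled event types
      if (enabled.has(event.msg)) send(event);
    }
  };
}
```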
<br />
All other messages, such as pickover, pickout, drag and others, can be derived from these. For example, pickover and pickout can be derived from mouseover and pick messages (when the object ID under the cursor changes, it's a pickover for the new object and a pickout for the old one). Another use case is to use mouseover/mouseout to enable/disable picking to reduce network traffic.<br />
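For example, deriving pickover/pickout from the pick stream only needs the last object ID seen under the cursor. makePickTracker is a hypothetical helper, and the derived message shapes are illustrative:

```javascript
// Derive "pickover"/"pickout" events from a stream of "pick" messages
// by watching for changes in the object ID under the cursor.
function makePickTracker(emit) {
  let current = null;                 // ID currently under the cursor
  return function onPick(pick) {
    if (pick.id !== current) {
      if (current !== null) emit({ msg: "pickout", id: current });
      if (pick.id) emit({ msg: "pickover", id: pick.id, pos: pick.pos });
      current = pick.id || null;
    }
  };
}
```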
<br />
===Keyboard===<br />
<br />
These messages are sent <u>from the graphics system</u> in response to the user's actions on the keyboard:<br />
{<br />
msg: "keydown",//key was pressed, but not released<br />
altKey:true,<br />
metaKey:false,<br />
ctrlKey:true,<br />
repeat:false,<br />
shiftKey:true,<br />
which:25//see keyCode<br />
}<br />
<br />
{<br />
msg: "keyup",//key was released<br />
altKey:true,<br />
metaKey:false,<br />
ctrlKey:true,<br />
repeat:false,<br />
shiftKey:true,<br />
which:25//see keyCode<br />
}<br />
<br />
For consistency with the web we also allow keypress, despite the fact that it can be derived from keydown and keyup:<br />
{<br />
msg: "keypress",<br />
altKey:true,<br />
metaKey:false,<br />
ctrlKey:true,<br />
repeat:false,//we send repeated message if user holds a key, every message except first one will have this property set to true<br />
shiftKey:true,<br />
which:25//see keyCode<br />
}<br />
<br />
===Requesting intersection===<br />
<br />
Request <u>to the graphics system</u>:<br />
{<br />
msg:"raytrace",<br />
id:5,//request ID<br />
pos:[2,3,4],//origin of the ray<br />
dir:[.24,.33,.5],//direction of the ray<br />
multiple:true,//if false, only return first hit, otherwise return all intersections<br />
infinite:false//if false, use length of dir to specify ray length<br />
}<br />
<br />
Response <u>from the graphics system</u>:<br />
{<br />
msg:"intersections",<br />
id:5,//request ID<br />
pos:[[2,3,4],[2.23,3.32,4.49]],//positions of the points of intersection<br />
normals:[[0,1,0],[.5,0,.86]],//normals of the surface at intersection points<br />
ids:["f47ac10b-58cc-4372-a567-0e02b2c3d479","a33ff133-58dd-2272-dd6a-12aadc31d173"]//object IDs for each intersected surface<br />
}<br />
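Since responses arrive asynchronously, a caller would presumably match them to outstanding requests by the request ID. A sketch (makeRaytracer is a hypothetical wrapper around the send interface described earlier; wire its onMessage into the gfx callback):

```javascript
// Match asynchronous "intersections" responses to their "raytrace"
// requests by request ID, invoking the caller's callback once.
function makeRaytracer(gfx) {
  const pending = new Map();         // request ID -> callback
  let nextId = 1;
  return {
    raytrace(pos, dir, opts, callback) {
      const id = nextId++;
      pending.set(id, callback);
      gfx.send(Object.assign({ msg: "raytrace", id, pos, dir }, opts));
    },
    onMessage(msg) {                 // feed graphics-system messages here
      if (msg.msg === "intersections" && pending.has(msg.id)) {
        pending.get(msg.id)(msg);
        pending.delete(msg.id);
      }
    }
  };
}
```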
<br />
=Experimental/Brainstorming ideas for the API=<br />
I decided to reserve a section of the wiki for sort of bleeding edge ideas of cool features that would be nice to have. I could have put that in the "talk" page, but I think it makes more sense here so that it will get wider exposure. These are meant to be things that would help in drawing real scenes and building real VW systems but that we haven't figured out a good API to yet.<br />
<br />
==Attaching UI elements to graphics objects==<br />
The UI will naturally need to be in HTML since that's the best established cross platform, sandboxed UI system.<br />
<br />
The user may specify a 3d location, orientation and scale for a UI dialog. The graphics system should do its best to scale and position the UI in the appropriate place, but the UI may be restricted to always face the camera and always be horizontal relative to the bottom of the screen on many systems. The UI should not be displayed if it is completely invisible from the camera angle or smaller than 10 pixels.<br />
===Creating/Updating UI Element===<br />
{<br />
msg:"IFrame",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
uri:"http://example.com"<br />
}<br />
<br />
===Destroying UI Element===<br />
{<br />
msg:"DestroyIFrame",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
}<br />
<br />
<br />
It seems like there should be a way, aside from embedded iframes, to get art defined in the DOM into the scene graph. Perhaps the canvas tag is the way to go here, but that may be too WebGL-specific and won't work for an Ogre port of this.<br />
<br />
==Attaching 3d Text to an Object==<br />
I'm just brainstorming here: it seems like WebGL has facilities to do this efficiently, but I don't have a good use case except building a rendering system inside a canvas tag or something.<br />
<br />
Perhaps the canvas tag is the way to go <br />
<br />
{<br />
msg:"Text",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
text:"This is a test of the emergency broadcast system",<br />
font:"size=+1"<br />
}<br />
<br />
{<br />
msg:"DestroyText",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
}<br />
<br />
==Particle System==<br />
===Adding a particle system to an object===<br />
This mimics the Ogre interface, and we introduce a number of billboard types:<br />
point<br />
The default arrangement; this approximates spherical particles and the billboards always fully face the camera.<br />
oriented_common<br />
Particles are oriented around a common, typically fixed direction vector (see common_direction), which acts as their local Y axis. The billboard rotates only around this axis, giving the particle some sense of direction. Good for rainstorms, starfields etc. where the particles will be traveling in one direction; this is slightly faster than oriented_self (see below).<br />
oriented_self<br />
Particles are oriented around their own direction vector, which acts as their local Y axis. As the particle changes direction, the billboard reorients itself to face that way. Good for laser fire, fireworks and other 'streaky' particles that should look like they are traveling in their own direction.<br />
perpendicular_common<br />
Particles are perpendicular to a common, typically fixed direction vector (see common_direction), which acts as their local Z axis, with their local Y axis coplanar with the common direction and the common up vector (see common_up_vector). The billboard never rotates to face the camera; you might use a double-sided material to ensure particles are never culled by back-facing. Good for aureolas, rings etc. where the particles will be perpendicular to the ground; this is slightly faster than perpendicular_self (see below).<br />
perpendicular_self<br />
Particles are perpendicular to their own direction vector, which acts as their local Z axis, with their local Y axis coplanar with their own direction vector and the common up vector (see common_up_vector). The billboard never rotates to face the camera; you might use a double-sided material to ensure particles are never culled by back-facing.<br />
For further documentation about the properties see<br />
http://www.ogre3d.org/docs/manual/manual_35.html#SEC191<br />
http://www.ogre3d.org/docs/manual/manual_36.html#SEC208<br />
<br />
<br />
{<br />
msg:"ParticleSystem",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
mesh:"http://example.com/billboard.dae",//the mesh should be rescaled to be a 1x1 mesh<br />
particle_size:[20,20],<br />
cull_each:false,<br />
quota:10000,<br />
billboard:"oriented_self",<br />
sorted:false,//defaults to false--whether the particles should be sorted<br />
local:false,//defaults to false--if true, rotation of the node after the emission of the particle will rotate it<br />
direction:[0,0,1],//the common direction for oriented_common or perpendicular_common<br />
up:[0,0,1],//only required if billboard_type is set to perpendicular_self or perpendicular_common; this vector is the common up vector used to orient all particles in the system<br />
accurate_facing:false,//whether the facing is set to the camera facing or calculated per billboard<br />
iteration_interval:.125,//how often the particles are updated--if set to 0, defaults to framerate<br />
invisibility_timeout:10//how many seconds of being outside the frustum before the system stops updating<br />
}<br />
<br />
{<br />
msg:"DestroyParticleSystem",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479"<br />
}<br />
<br />
<br />
Once a system is created, particles need to be emitted from it. There should be a global map of default emitters named ParticleEmitters consisting of at least<br />
"Point", "Box", "Cylinder", "Ellipsoid", "Shell" and "Ring"; the extra attributes are specified in http://www.ogre3d.org/docs/manual/manual_38.html<br />
{<br />
msg:"ParticleEmitter",//add or update a particle emitter<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
name:"Flare",<br />
type:"Ring",<br />
angle:15,<br />
emission_rate:75,<br />
time_to_live:[2.5,3],//range between 2.5 and 3<br />
direction:[0,1,0],//3d vector<br />
speed:[250,300],//range between 250 and 300<br />
colour_range:[[1,0,0],[0,0,1]],//random color between the two endpoints<br />
position:[0,0,0],<br />
repeat_delay:[2.5,5]<br />
<br />
}<br />
<br />
{<br />
msg:"RemoveParticleEmitter",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
name:"Flare"<br />
}<br />
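Several emitter attributes above are ranges ([min,max]) or a pair of colour endpoints. Here is a sketch of sampling one particle's parameters from such a description; uniform sampling and linear colour interpolation are my assumptions about how the ranges would be used, and the helper names are illustrative.<br />

```javascript
// Sample a scalar from a [min,max] range given u in [0,1), e.g. Math.random().
function sampleRange(r, u) {
  return r[0] + u * (r[1] - r[0]);
}

// Lerp between the two colour endpoints of a colour_range attribute.
function sampleColour(range, u) {
  return range[0].map(function (c0, i) {
    return c0 + u * (range[1][i] - c0);
  });
}
```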
<br />
Forces may be applied to the emitters,<br />
and there must be a global map of affectors called ParticleAffector, from which the relevant affector is selected, consisting of at least<br />
<br />
LinearForce, ColourFader, Scaler, Rotator, ColourInterpolator, ColourImage, DeflectorPlane, DirectionRandomiser.<br />
The detailed definitions are contained at http://www.ogre3d.org/docs/manual/manual_40.html#SEC234<br />
<br />
<br />
<br />
<br />
{<br />
msg:"ParticleAffector",//add or update a particle affector<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
name:"TheForce",<br />
type:"LinearForce",<br />
force_vector:[0,-100,0],<br />
force_application:"add"<br />
<br />
}<br />
<br />
{<br />
msg:"RemoveParticleAffector",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
name:"TheForce"<br />
}<br />
<br />
==Specifying a terrain for the world==<br />
Terrain would need to be chunked and in some widely readable format. It would be nice to be able to tap into Google Earth for terrain; ideas for how to do this are still very early.<br />
<br />
=Deprecated API Ideas=<br />
Here we put ideas we had but decided to discard, so that they don't come up again as new ideas. They may be discussed here and evaluated for re-addition if someone feels strongly that they should be included.<br />
<br />
<br />
==Skeleton file formats==<br />
<br />
These were removed because they are too brittle (it's hard to blend a wave animation with a walk animation and not have the steps come out half as wide) and because it's difficult to keep the skeletons out of trouble (e.g. feet through the ground),<br />
so we think that the physics system in general should send the bone positions and timestamps, since it's the arbiter of what intersects what--and it can always read the skeleton file format.<br />
<br />
===Animating a skeleton based on a time based animation===<br />
{<br />
msg:"Ani",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
time:489192048120984102,//milliseconds since 1970 that the animation should be started from (skip frames if now is later)<br />
animation:"http://example.com/animation.dae",<br />
loop:false,<br />
weight:1.0,//how strong this animation should be compared with other animations that use the same bones<br />
fadein:2.3//how many seconds to fade in<br />
}<br />
<br />
Note that the animation.dae should have annotations for the loop-in point and loop-out point within the .dae so that loop can function intelligently.<br />
===Stopping a skeleton based on a time based animation===<br />
{<br />
msg:"AniStop",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
animation:"http://example.com/animation.dae",<br />
fadeout:1.0//how many seconds to fade out<br />
}<br />
<br />
<br />
==Should an object just be a sprite==<br />
We figured that a collada square file may be a more compact representation for a sprite and can contain the appropriate shader, materials, etc.<br />
===Making an object a point sprite===<br />
This mimics the Ogre interface, and we introduce a number of billboard types:<br />
point<br />
The default arrangement; this approximates spherical particles and the billboards always fully face the camera.<br />
oriented_self<br />
Particles are oriented around their own direction vector, which acts as their local Y axis. As the particle changes direction, the billboard reorients itself to face that way. Good for laser fire, fireworks and other 'streaky' particles that should look like they are traveling in their own direction.<br />
perpendicular_self<br />
Particles are perpendicular to their own direction vector, which acts as their local Z axis, with their local Y axis coplanar with their own direction vector and the common up vector (see common_up_vector). The billboard never rotates to face the camera; you might use a double-sided material to ensure particles are never culled by back-facing.<br />
For further documentation about the properties see<br />
http://www.ogre3d.org/docs/manual/manual_35.html#SEC191<br />
http://www.ogre3d.org/docs/manual/manual_36.html#SEC208<br />
<br />
{<br />
msg:"Sprite",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
sprite:"http://example.com/test.jpg",//the sprite should be rescaled to fit inside a unit sphere centered at 0,0,0<br />
billboard:"perpendicular_self",<br />
sorted:false,//defaults to false--whether the particles should be sorted<br />
up:[0,0,1]//only required if billboard_type is set to perpendicular_self; this vector is the common up vector used to orient all particles in the system<br />
}<br />
<br />
===Removing point sprite property from object===<br />
{<br />
msg:"DestroySprite",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
}</div>Rrykhttps://www.sirikata.com/wiki/index.php?title=JavascriptGraphicsAPI&diff=663JavascriptGraphicsAPI2010-08-05T09:35:32Z<p>Rryk: /* Updating shader property (Vertex and Fragment float4) for an object */</p>
<hr />
<div>=Rationale for a common API to 3d graphics systems=<br />
<br />
Objects are sent across the thread barrier to alter the current scene graph being displayed--here's why:<br />
<br />
In modern engines, graphics framerates should not be tied to physics framerates, and networking events (and decoding of those events) should happen at the correct pace to keep up with the network adapter. Graphics, however, is tied to the DOM and therefore must be on the main thread. This forces both networking and physics onto web worker threads if there is to be any threading.<br />
<br />
These threads need to advertise state changes to the main graphics thread so that the scene graph may be altered at the graphics rate. This requires that the individual physics and networking threads send timestamped events to the graphics system which drive changes to it. <br />
<br />
Since graphics can run at different rates and the updates from the network may be irregular, the graphics (main) thread needs a smooth interpolation scheme that blends the current position with the timestamped updates sent by the network/physics thread(s). The interpolation scheme should use cubic interpolation based on the most recent update's position and orientation, along with the displayed position and orientation of the object when that update was received.<br />
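A sketch of the cubic interpolation described above, written as a standard Hermite blend between the state the object was displayed at when the update arrived (p0, v0 at time t0) and the update's target state (p1, v1 at t1). The function name and signature are illustrative, not part of the API.<br />

```javascript
// Cubic Hermite interpolation of a 3-vector position.
// p0, p1: positions; v0, v1: velocities; t0, t1: timestamps; now: current time.
function hermite(p0, v0, p1, v1, t0, t1, now) {
  var dt = t1 - t0;
  var s = Math.min(Math.max((now - t0) / dt, 0), 1); // normalized time in [0,1]
  var s2 = s * s, s3 = s2 * s;
  // Standard Hermite basis functions.
  var h00 = 2 * s3 - 3 * s2 + 1, h10 = s3 - 2 * s2 + s;
  var h01 = -2 * s3 + 3 * s2,    h11 = s3 - s2;
  return p0.map(function (x, i) {
    return h00 * x + h10 * dt * v0[i] + h01 * p1[i] + h11 * dt * v1[i];
  });
}
```

Orientation would be handled analogously with quaternion interpolation (e.g. slerp) rather than per-component blending.<br />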
<br />
Below are some example objects that may be sent cross-thread. The objects are listed in JSON format so the type information should be clear from the example.<br />
<br />
=API To Graphics System=<br />
Graphics should provide a constructor that takes a callback and a parent DOM element and returns an object with a send(obj) method, which accepts serializable objects from other threads that modify graphics state, and an optional destroy() method, which cleans up graphics state in the DOM.<br />
<br />
So a sequence of code to construct a graphics system, make an object, and destroy it could look like:<br />
gfx = new GLGERenderer(callbackFunction, parentElement);<br />
gfx.send({ msg:"Create", id:"f47ac10b-58cc-4372-a567-0e02b2c3d479", time: 2181298451298491284, pos:[1,2,3], orient:[.5,0,0,.5] });<br />
gfx.destroy();<br />
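A minimal sketch of a renderer implementing this interface. GLGERenderer is the name used in the example above; this stub (my own) only routes Create/Destroy messages into a state table and does not actually draw anything.<br />

```javascript
// StubRenderer: illustrates the constructor/send/destroy contract.
function StubRenderer(callback, parentElement) {
  this.callback = callback;  // used for messages back to the sending thread
  this.parent = parentElement;
  this.objects = {};         // id -> last known object state
}
StubRenderer.prototype.send = function (m) {
  if (m.msg === "Create") this.objects[m.id] = m;
  else if (m.msg === "Destroy") delete this.objects[m.id];
  // a real renderer would dispatch every message type described below
};
StubRenderer.prototype.destroy = function () {
  this.objects = {};         // a real renderer would also tear down DOM state
};
```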
<br />
=Cross thread communication from Physics and Networking to graphics=<br />
==Object Management==<br />
id's can be anything from human readable strings to uuids to integers. They just each must be unique and chosen by the user of the API<br />
===Creating a new graphics object===<br />
<br />
{<br />
msg:"Create",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
spaceid:"aaaaaaaa-bbbb-cccc-dada-134234ab98",//<-- optional (defaults to the empty space, 0)<br />
time: 2181298451298491284,//milliseconds since 1970<br />
pos:[1,2,3],<br />
vel:[.25,0,0],<br />
orient:[.5,0,0,.5],<br />
rotaxis:[0,0,1],<br />
rotvel:.25,<br />
scale:[1,1,1],//defaults to 1,1,1 if absent<br />
parent:"c46ac00b-58cc-4372-a567-0e02b2c3d479",//<-- optional (defaults to empty, i.e. toplevel, if absent)<br />
parentbone:"Hand"//name of the bone on the parent object that this is attached to; assume root transform otherwise<br />
}<br />
<br />
===Moving a graphics object===<br />
//should we define that the graphics system has some sort of interp--otherwise velocity may be useless?<br />
<br />
Should we use "parentbone" or "attachment_point"?<br />
<br />
FIXME: We should default these values to the last position, not to identity. Otherwise we basically have to send everything each time.<br />
<br />
{<br />
msg:"Move",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
time:39852398592385,//milliseconds since 1970<br />
pos:[1,2,3],<br />
vel:[.25,0,0],<br />
orient:[.5,0,0,.5],//defaults to (identity) if absent<br />
rotaxis:[0,0,1],//defaults to 0,0,1 if absent, forcing rotvel to 0 <br />
rotvel:.25,//defaults to 0 if absent<br />
scale:[1,1,1],//defaults to 1,1,1 if absent<br />
interpolate:true,//set to false if the object should snap to new position<br />
parent:"c46ac00b-58cc-4372-a567-0e02b2c3d479"//<-- optional (defaults to previous state if absent, to clear pass empty string)<br />
attachment_point:"Hand"//name of the bone on the parent object that this is attached to. Defaults to previous state if absent, to clear pass empty string<br />
}<br />
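The FIXME above suggests that absent Move fields should default to the object's last state rather than to identity. A sketch of that merge; the field list is taken from the Move example, while the helper itself is an assumption about how a renderer might implement it.<br />

```javascript
// Merge a Move message over the previously stored state for an object.
var MOVE_FIELDS = ["pos", "vel", "orient", "rotaxis", "rotvel", "scale",
                   "parent", "attachment_point"];

function applyMove(prevState, moveMsg) {
  var next = {};
  MOVE_FIELDS.forEach(function (f) {
    // absent fields keep their previous value; an empty string still
    // counts as present, so it can clear parent/attachment_point
    next[f] = (moveMsg[f] !== undefined) ? moveMsg[f] : prevState[f];
  });
  next.time = moveMsg.time; // the timestamp always comes from the message
  return next;
}
```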
<br />
===Destroying a graphics object===<br />
<br />
{<br />
msg:"Destroy",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479"<br />
}<br />
<br />
<br />
==Managing object appearance properties==<br />
===Adding/changing mesh property for an object===<br />
{<br />
msg:"Mesh",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
type:"collada",//string that specifies file format<br />
mesh:"http://example.com/test.dae",//the mesh should be rescaled to fit inside a unit sphere centered at 0,0,0<br />
}<br />
<br />
===Updating shader property (Vertex and Fragment float4) for an object===<br />
{<br />
msg:"MeshShaderUniform",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
name:["ColorTint","HowManyIterations"],<br />
value:[[.24,.24,.25,1.0],[1,0,0,0]],<br />
type:["float4","int4"]<br />
}<br />
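The MeshShaderUniform message carries parallel arrays (name/value/type). A sketch of turning them into a per-object uniform table that a WebGL backend could later upload with gl.uniform4fv / gl.uniform4iv; the helper name and table shape are my own.<br />

```javascript
// Convert a MeshShaderUniform message's parallel arrays into a name-keyed
// table, so a renderer can look uniforms up when binding a shader program.
function collectUniforms(m) {
  var table = {};
  for (var i = 0; i < m.name.length; i++) {
    table[m.name[i]] = { value: m.value[i], type: m.type[i] };
  }
  return table;
}
```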
<br />
===Removing mesh property for an object===<br />
{<br />
msg:"DestroyMesh",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
}<br />
<br />
===Adding/changing light property for an object===<br />
<br />
{<br />
msg:"Light",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
diffuse_color:[.25,.5,1],<br />
specular_color:[.2,1,.5],<br />
power:1.0,//exponent on the light<br />
ambient_color:[0,0,0],<br />
light_range:1.0e5,<br />
constant_falloff: 0.5,<br />
linear_falloff: 0.2,<br />
quadratic_falloff: 0.1,<br />
cone_inner_radians: 0,<br />
cone_outer_radians: 0,<br />
cone_falloff: 0.5,<br />
type: "POINT",//options include "SPOTLIGHT" or "DIRECTIONAL"<br />
casts_shadow: true,<br />
shader:"http://www.example.com/pointLight.shader"// light shader for ray tracing (empty by default)<br />
}<br />
<br />
===Removing light property for an object===<br />
{<br />
msg:"DestroyLight",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479"<br />
}<br />
<br />
==Camera Management==<br />
<br />
===Creating camera properties on an object===<br />
{<br />
msg:"Camera",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479"<br />
}<br />
<br />
===Destroying and cleaning up a camera===<br />
{<br />
msg:"DestroyCamera",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479"<br />
}<br />
<br />
===Attach a camera to an object's texture===<br />
{<br />
msg:"AttachCamera",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479", // Camera object's id<br />
texobjid:"9a10e9c1-31fb-43e8-9a20-6545d9a62fdb", // Id of object with a mesh<br />
texname:"example.png"//overwrites this texture on the texobjid object<br />
}<br />
===Attach a camera to a render target===<br />
{<br />
msg:"AttachCamera",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479", // Camera object's id<br />
target:0//writes to this framebuffer--- 0 for left, 1 for right in stereo, etc.<br />
}<br />
===Detach a camera from its render target===<br />
{<br />
msg:"DetachCamera",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479" // Camera object's id<br />
}<br />
<br />
==Skeleton Management==<br />
<br />
===Streaming some joint locations===<br />
{<br />
msg:"AnimateBone",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
animation:"uniqueAnimationIdentifier",//so this movement can be associated with one animation and blended with others;<br />
//if not specified this is a hard constraint (i.e. foot is glued to a wall in order to avoid penetrating it)<br />
weight:1.0,//the weight for prospective blending, defaults to 1.0<br />
time:1250120951209510295,//milliseconds since 1970<br />
bone:["ankle","arm"],<br />
pos:[[1,2,3],[2,3,4]],<br />
vel:[[.25,0,0],[0,0,0]],<br />
orient:[[.5,0,0,.5],[1,0,0,0]],<br />
rotaxis:[[0,0,1],[0,1,0]],<br />
rotvel:[.25,0],<br />
interpolate:true//if false, the bone should snap to the location unless smooth is set (in which case it should interpolate as quickly as possible); defaults to true<br />
}<br />
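AnimateBone messages carry per-animation weights for prospective blending. A sketch of one plausible policy (my assumption, not specified above): normalize the weights of all animations touching a bone so they sum to 1, then mix their target positions.<br />

```javascript
// Blend the candidate positions for one bone by normalized weight.
// entries: [{pos:[x,y,z], weight:w}, ...] -- one entry per active animation.
function blendBone(entries) {
  var total = entries.reduce(function (s, e) { return s + e.weight; }, 0);
  var out = [0, 0, 0];
  entries.forEach(function (e) {
    var w = total > 0 ? e.weight / total : 0; // normalized weight
    for (var i = 0; i < 3; i++) out[i] += w * e.pos[i];
  });
  return out;
}
```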
<br />
==Event handling==<br />
<br />
===Mouse Events===<br />
<br />
Messages <u>from the graphics system</u> for standard browser events:<br />
{<br />
msg:"mousemove",<br />
x:100,//X coordinate of the mouse, relative to canvas<br />
y:102,//Y coordinate of the mouse, relative to canvas<br />
}<br />
<br />
{<br />
msg:"mousedown",<br />
which:0,//right mouse button = 2, left mouse button = 0<br />
x:100,//X coordinate of the mouse, relative to canvas<br />
y:102,//Y coordinate of the mouse, relative to canvas<br />
}<br />
<br />
{<br />
msg:"mouseup",<br />
which:2,//right mouse button = 2, left mouse button = 0<br />
x:100,//X coordinate of the mouse, relative to canvas<br />
y:102,//Y coordinate of the mouse, relative to canvas<br />
}<br />
<br />
{<br />
msg:"mouseover",<br />
x:100,//X coordinate of the mouse, relative to canvas<br />
y:102,//Y coordinate of the mouse, relative to canvas<br />
}<br />
<br />
{<br />
msg:"mouseout",<br />
x:100,//X coordinate of the mouse, relative to canvas<br />
y:102,//Y coordinate of the mouse, relative to canvas<br />
}<br />
<br />
Despite the fact that click can be derived from mousedown and mouseup, we keep it for consistency with the Web.<br />
{<br />
msg:"click",<br />
which:0,//right mouse button = 2, left mouse button = 0<br />
x:100,//X coordinate of the mouse, relative to canvas<br />
y:102,//Y coordinate of the mouse, relative to canvas<br />
}<br />
<br />
As an alternative to mousemove, we introduce a pick message that returns additional 3D data:<br />
{<br />
msg:"pick",<br />
x:100,//X coordinate of the mouse, relative to canvas<br />
y:102,//Y coordinate of the mouse, relative to canvas<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",//which object was hit?<br />
pos:[1,2,3],//where on the surface did it hit?<br />
normal:[.5,0,.86],//what the direction of the normal is at that point<br />
}<br />
<br />
To save traffic we allow messages <u>to the graphics system</u> that enable/disable these messages:<br />
{<br />
msg:"enable",<br />
type:"mouseover"//message type<br />
}<br />
<br />
{<br />
msg:"disable",<br />
type:"pick"//message type<br />
}<br />
<br />
All other messages, such as pickover, pickout, drag and others, can be derived from these. For example, pickover and pickout can be derived from mouseover and pick messages: when the object ID under the cursor changes, it is a pickover for the new object and a pickout for the old one. Another use case is to use mouseover and mouseout to enable/disable picking to reduce network traffic.<br />
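The pickover/pickout derivation described above can be sketched as a small tracker over the pick stream; the event names mirror the text, while the tracker itself is illustrative.<br />

```javascript
// Derive pickover/pickout from successive "pick" messages: when the object
// ID under the cursor changes, emit pickout for the old id, pickover for
// the new one.
function makePickTracker(emit) {
  var lastId = null;
  return function onPick(m) { // m is a "pick" message from the graphics system
    if (m.id === lastId) return; // still over the same object, nothing to do
    if (lastId !== null) emit({ msg: "pickout", id: lastId });
    emit({ msg: "pickover", id: m.id });
    lastId = m.id;
  };
}
```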
<br />
===Keyboard===<br />
<br />
These messages are sent <u>from the graphics system</u> in response to the user's keyboard actions:<br />
{<br />
msg: "keydown",//key was pressed, but not released<br />
altKey:true,<br />
metaKey:false,<br />
ctrlKey:true,<br />
repeat:false,<br />
shiftKey:true,<br />
which:25//see keyCode<br />
}<br />
<br />
{<br />
msg: "keyup",//key was released<br />
altKey:true,<br />
metaKey:false,<br />
ctrlKey:true,<br />
repeat:false,<br />
shiftKey:true,<br />
which:25//see keyCode<br />
}<br />
<br />
For consistency with the web we also allow keypress, despite the fact that it can be derived from keydown and keyup:<br />
{<br />
msg: "keypress",<br />
altKey:true,<br />
metaKey:false,<br />
ctrlKey:true,<br />
repeat:false,//we send repeated messages if the user holds a key; every message except the first will have this property set to true<br />
shiftKey:true,<br />
which:25//see keyCode<br />
}<br />
<br />
</div>Rrykhttps://www.sirikata.com/wiki/index.php?title=JavascriptGraphicsAPI&diff=661JavascriptGraphicsAPI2010-08-03T15:43:48Z<p>Rryk: /* Keyboard */</p>
<hr />
<div>=Rationale for a common API to 3d graphics systems=<br />
<br />
Objects are sent across the thread barrier to alter the current scene graph being displayed--here's why:<br />
<br />
In modern engines, the graphics framerate should not be tied to the physics framerate, and networking events should be decoded at whatever pace keeps up with the network adapter. Graphics, however, is tied to the DOM and therefore must run on the main thread. This forces both networking and physics onto web worker threads if there is to be any threading.<br />
<br />
These threads need to advertise state changes to the main graphics thread so that the scene graph may be altered at the graphics rate. This requires that the individual physics and networking threads send timestamped events to the graphics system which drive changes to it. <br />
<br />
Since graphics can run at different rates and updates from the network may be irregular, the graphics (main) thread needs to smoothly interpolate the current position toward the timestamped updates sent by the network/physics thread(s). A good scheme is cubic interpolation between the position and orientation the object was displayed at when an update was received and the position and orientation carried by that update.<br />
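The interpolation step can be sketched as a cubic Hermite blend between the state an object was displayed at when an update arrived and the state carried by the update. This is an illustrative sketch, not part of the message API; the velocity terms are assumed to be pre-scaled by the length of the interpolation interval.<br />

```javascript
// Cubic Hermite interpolation between the displayed state when an update
// arrived (p0, v0) and the update's state (p1, v1). s is normalized time
// in [0, 1]; velocities must be pre-scaled by the interval length.
function hermite(p0, v0, p1, v1, s) {
  // Standard Hermite basis functions.
  const h00 = 2*s**3 - 3*s**2 + 1;
  const h10 = s**3 - 2*s**2 + s;
  const h01 = -2*s**3 + 3*s**2;
  const h11 = s**3 - s**2;
  return p0.map((x, i) => h00*x + h10*v0[i] + h01*p1[i] + h11*v1[i]);
}
```

At s=0 this returns the displayed state and at s=1 the update's state, so position and velocity stay continuous across updates.<br />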
<br />
Below are some example objects that may be sent cross-thread. The objects are listed in JSON-like notation, so the type information should be clear from the examples.<br />
<br />
=API To Graphics System=<br />
Graphics should provide a constructor that takes a callback and a parent DOM element and returns an object with a send(obj) method, which accepts serializable objects from other threads that modify graphics state, and an optional destroy() method, which cleans up graphics state in the DOM.<br />
<br />
So a sequence of code to construct a graphics system, make an object, and destroy it could look like:<br />
gfx= new GLGERenderer(callbackFunction,parentElement)<br />
gfx.send({ msg:"Create", id:"f47ac10b-58cc-4372-a567-0e02b2c3d479", time: 2181298451298491284, pos:[1,2,3], orient:[.5,0,0,.5]})<br />
gfx.destroy();<br />
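A minimal sketch of what a conforming renderer's shell might look like. ExampleRenderer and its handler table are hypothetical; a real implementation such as GLGERenderer would update an actual scene graph rather than a plain map.<br />

```javascript
// Minimal shape of a renderer conforming to this API: send() dispatches on
// the msg field; unknown messages are reported through the callback.
function ExampleRenderer(callback, parentElement) {
  const objects = {};          // id -> object state
  const handlers = {
    Create:  (m) => { objects[m.id] = { pos: m.pos, orient: m.orient }; },
    Move:    (m) => { Object.assign(objects[m.id], m); },
    Destroy: (m) => { delete objects[m.id]; }
  };
  this.send = function (m) {
    const h = handlers[m.msg];
    if (h) h(m);
    else callback({ msg: "error", reason: "unknown msg " + m.msg });
  };
  this.destroy = function () { /* remove canvas etc. from parentElement */ };
  this.objects = objects;      // exposed here only so the sketch is testable
}
```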
<br />
=Cross thread communication from Physics and Networking to graphics=<br />
==Object Management==<br />
IDs can be anything from human-readable strings to UUIDs to integers. Each must be unique and is chosen by the user of the API.<br />
===Creating a new graphics object===<br />
<br />
{<br />
msg:"Create",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
spaceid:"aaaaaaaa-bbbb-cccc-dada-134234ab98",//<-- optional (defaults to the empty space, 0)<br />
time: 2181298451298491284,//milliseconds since 1970<br />
pos:[1,2,3],<br />
vel:[.25,0,0],<br />
orient:[.5,0,0,.5],<br />
rotaxis:[0,0,1],<br />
rotvel:.25,<br />
scale:[1,1,1],//defaults to 1,1,1 if absent<br />
parent:"c46ac00b-58cc-4372-a567-0e02b2c3d479",//<-- optional (defaults to empty--toplevel if absent)<br />
parentbone:"Hand"//name of the bone on the parent object that this is attached to. Assume root transform otherwise<br />
}<br />
<br />
===Moving a graphics object===<br />
//should we define that the graphics system has some sort of interp--otherwise velocity may be useless?<br />
<br />
Should we use "parentbone" or "attachment_point"?<br />
<br />
FIXME: We should default these values to the last position, not to identity. Otherwise we basically have to send everything each time.<br />
<br />
{<br />
msg:"Move",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
time:39852398592385,//milliseconds since 1970<br />
pos:[1,2,3],<br />
vel:[.25,0,0],<br />
orient:[.5,0,0,.5],//defaults to (identity) if absent<br />
rotaxis:[0,0,1],//defaults to 0,0,1 if absent, forcing rotvel to 0 <br />
rotvel:.25,//defaults to 0 if absent<br />
scale:[1,1,1],//defaults to 1,1,1 if absent<br />
interpolate:true,//set to false if the object should snap to new position<br />
parent:"c46ac00b-58cc-4372-a567-0e02b2c3d479",//<-- optional (defaults to previous state if absent, to clear pass empty string)<br />
attachment_point:"Hand"//name of the bone on the parent object that this is attached to. Defaults to previous state if absent, to clear pass empty string<br />
}<br />
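The defaulting behavior proposed in the FIXME above can be sketched as a merge: fields absent from a Move message keep their previous values, so senders only transmit what changed. applyMove is an illustrative helper name, not part of the API.<br />

```javascript
// Merge a sparse Move message into an object's stored state: absent fields
// keep their last value rather than resetting to identity.
function applyMove(prev, move) {
  const next = Object.assign({}, prev);
  const fields = ["pos", "vel", "orient", "rotaxis", "rotvel",
                  "scale", "parent", "attachment_point"];
  for (const k of fields)
    if (move[k] !== undefined) next[k] = move[k];
  next.time = move.time;   // the timestamp always comes from the update
  return next;
}
```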
<br />
===Destroying a graphics object===<br />
<br />
{<br />
msg:"Destroy",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479"<br />
}<br />
<br />
<br />
==Managing object appearance properties==<br />
===Adding/changing mesh property for an object===<br />
{<br />
msg:"Mesh",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
type:"collada",//string that specifies file format<br />
mesh:"http://example.com/test.dae",//the mesh should be rescaled to fit inside a unit sphere centered at 0,0,0<br />
}<br />
<br />
===Updating shader property (Vertex and Fragment float4) for an object===<br />
{<br />
msg:"MeshShaderUniform",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
name:["ColorTint","HowManyIterations"],<br />
value:[[.24,.24,.25,1.0],[1,0,0,0]],<br />
type:"float4"<br />
}<br />
<br />
===Removing mesh property for an object===<br />
{<br />
msg:"DestroyMesh",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
}<br />
<br />
===Adding/changing light property for an object===<br />
<br />
{<br />
msg:"Light",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
diffuse_color:[.25,.5,1],<br />
specular_color: [.2,1,.5],<br />
power: 1.0,//exponent on the light<br />
ambient_color: [0,0,0],<br />
light_range: 1.0e5,<br />
constant_falloff: 0.5,<br />
linear_falloff: 0.2,<br />
quadratic_falloff: 0.1,<br />
cone_inner_radians: 0,<br />
cone_outer_radians: 0,<br />
cone_falloff: 0.5,<br />
type: "POINT",//options include "SPOTLIGHT" or "DIRECTIONAL"<br />
casts_shadow: true,<br />
shader:"http://www.example.com/pointLight.shader"// light shader for ray tracing (empty by default)<br />
}<br />
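The falloff fields suggest the standard attenuation model 1/(c + l*d + q*d^2), cut off at light_range. This is an assumption about the renderer's formula, not something the message format mandates; the sketch below just shows how the three falloff coefficients combine.<br />

```javascript
// Distance attenuation from the Light message's falloff fields.
// Assumed model: 1 / (constant + linear*d + quadratic*d^2), zero beyond range.
function attenuation(light, d) {
  if (d > light.light_range) return 0;       // outside the light's range
  return 1 / (light.constant_falloff +
              light.linear_falloff * d +
              light.quadratic_falloff * d * d);
}
```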
<br />
===Removing light property for an object===<br />
{<br />
msg:"DestroyLight",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479"<br />
}<br />
<br />
==Camera Management==<br />
<br />
===Creating camera properties on an object===<br />
{<br />
msg:"Camera",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479"<br />
}<br />
<br />
===Destroying and cleaning up a camera===<br />
{<br />
msg:"DestroyCamera",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479"<br />
}<br />
<br />
===Attach a camera to an object's texture===<br />
{<br />
msg:"AttachCamera",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479", // Camera object's id<br />
texobjid:"9a10e9c1-31fb-43e8-9a20-6545d9a62fdb", // Id of object with a mesh<br />
texname:"example.png"//overwrites this texture on the texobjid object.<br />
}<br />
===Attach a camera to a render target===<br />
{<br />
msg:"AttachCamera",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479", // Camera object's id<br />
target:0//writes to this framebuffer--- 0 for left, 1 for right in stereo, etc.<br />
}<br />
===Detach a camera from its render target===<br />
{<br />
msg:"DetachCamera",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479" // Camera object's id<br />
}<br />
<br />
==Skeleton Management==<br />
<br />
===Streaming some joint locations===<br />
{<br />
msg:"AnimateBone",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
animation:"uniqueAnimationIdentifier",//so this movement can be associated with one animation and blended with others<br />
//if not specified this is a hard constraint (i.e. foot is glued to a wall in order to avoid penetrating it)<br />
weight:1.0,//the weight for prospective blending, defaults to 1.0<br />
time:1250120951209510295,//milliseconds since 1970<br />
bone:["ankle","arm"],<br />
pos:[[1,2,3],[2,3,4]],<br />
vel:[[.25,0,0],[0,0,0]],<br />
orient:[[.5,0,0,.5],[1,0,0,0]],<br />
rotaxis:[[0,0,1],[0,1,0]],<br />
rotvel:[.25,0],<br />
interpolate:true//if false then the bone should snap to the location unless smooth is set (in which case it should interpolate as quickly as possible); defaults to true<br />
}<br />
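Blending concurrent updates that target the same bone by their weights can be sketched as a normalized weighted average. This covers positions only; orientations would need slerp/nlerp in practice, and blendBonePos is an illustrative helper, not part of the API.<br />

```javascript
// Normalized weighted blend of several position updates for one bone.
// Each update is {weight, pos:[x,y,z]}; a missing weight defaults to 1.
function blendBonePos(updates) {
  const total = updates.reduce((s, u) => s + (u.weight ?? 1), 0);
  return [0, 1, 2].map(i =>
    updates.reduce((s, u) => s + (u.weight ?? 1) * u.pos[i], 0) / total);
}
```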
<br />
==Event handling==<br />
<br />
===Mouse Events===<br />
<br />
Messages <u>from the graphics system</u> for standard browser events:<br />
{<br />
msg:"mousemove",<br />
x:100,//X coordinate of the mouse, relative to canvas<br />
y:102,//Y coordinate of the mouse, relative to canvas<br />
}<br />
<br />
{<br />
msg:"mousedown",<br />
which:0,//right mouse button = 2, left mouse button = 0<br />
x:100,//X coordinate of the mouse, relative to canvas<br />
y:102,//Y coordinate of the mouse, relative to canvas<br />
}<br />
<br />
{<br />
msg:"mouseup",<br />
which:2,//right mouse button = 2, left mouse button = 0<br />
x:100,//X coordinate of the mouse, relative to canvas<br />
y:102,//Y coordinate of the mouse, relative to canvas<br />
}<br />
<br />
{<br />
msg:"mouseover",<br />
x:100,//X coordinate of the mouse, relative to canvas<br />
y:102,//Y coordinate of the mouse, relative to canvas<br />
}<br />
<br />
{<br />
msg:"mouseout",<br />
x:100,//X coordinate of the mouse, relative to canvas<br />
y:102,//Y coordinate of the mouse, relative to canvas<br />
}<br />
<br />
Despite the fact that click can be derived from mousedown and mouseup, we keep it for consistency with the Web.<br />
{<br />
msg:"click",<br />
which:0,//right mouse button = 2, left mouse button = 0<br />
x:100,//X coordinate of the mouse, relative to canvas<br />
y:102,//Y coordinate of the mouse, relative to canvas<br />
}<br />
<br />
As an alternative to mousemove we introduce a pick message that returns additional 3D data:<br />
{<br />
msg:"pick",<br />
x:100,//X coordinate of the mouse, relative to canvas<br />
y:102,//Y coordinate of the mouse, relative to canvas<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",//which object was hit?<br />
pos:[1,2,3],//where on the surface did it hit?<br />
normal:[.5,0,.86],//what the direction of the normal is at that point<br />
}<br />
<br />
To save traffic we allow messages <u>to the graphics system</u> that enable/disable these messages:<br />
{<br />
msg:"enable",<br />
type:"mouseover"//message type<br />
}<br />
<br />
{<br />
msg:"disable",<br />
type:"pick"//message type<br />
}<br />
<br />
All other messages, such as pickover, pickout, drag and others, can be derived from these. For example, pickover and pickout can be derived from mouseover and pick messages: when the object ID under the cursor changes, it's a pickover for the new object and a pickout for the old one. Another use case is to use mouseover and mouseout to enable/disable picking to reduce network traffic.<br />
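The pickover/pickout derivation described above can be sketched as a small tracker over the pick stream (makePickTracker and the derived message shapes are illustrative):<br />

```javascript
// Derive pickover/pickout events from pick messages: when the object ID
// under the cursor changes, emit pickout for the old ID and pickover for
// the new one. emit receives the derived messages.
function makePickTracker(emit) {
  let lastId = null;
  return function onPick(pick) {
    if (pick.id !== lastId) {
      if (lastId !== null) emit({ msg: "pickout", id: lastId });
      if (pick.id) emit({ msg: "pickover", id: pick.id, pos: pick.pos });
      lastId = pick.id;
    }
  };
}
```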
<br />
===Keyboard===<br />
<br />
These messages are sent <u>from the graphics system</u> in response to the user's keyboard actions:<br />
{<br />
msg: "keydown",//key was pressed, but not released<br />
altKey:true,<br />
metaKey:false,<br />
ctrlKey:true,<br />
repeat:false,<br />
shiftKey:true,<br />
which:25//see keyCode<br />
}<br />
<br />
{<br />
msg: "keyup",//key was released<br />
altKey:true,<br />
metaKey:false,<br />
ctrlKey:true,<br />
repeat:false,<br />
shiftKey:true,<br />
which:25//see keyCode<br />
}<br />
<br />
For consistency with the web we also allow keypress, despite the fact that it can be derived from keydown and keyup:<br />
{<br />
msg: "keypress",<br />
altKey:true,<br />
metaKey:false,<br />
ctrlKey:true,<br />
repeat:false,//we send repeated messages if the user holds a key; every message except the first will have this property set to true<br />
shiftKey:true,<br />
which:25//see keyCode<br />
}<br />
<br />
===Requesting intersection===<br />
<br />
Request <u>to the graphics system</u>:<br />
{<br />
msg:"raytrace",<br />
id:5,//request ID<br />
pos:[2,3,4],//origin of the ray<br />
dir:[.24,.33,.5],//direction of the ray<br />
multiple:true,//if false, only return first hit, otherwise return all intersections<br />
infinite:false//if false use length of dir to specify ray length<br />
}<br />
<br />
Response <u>from the graphics system</u>:<br />
{<br />
msg:"intersections",<br />
id:5,//request ID<br />
pos:[[2,3,4],[2.23,3.32,4.49]],//positions of the points of intersection<br />
normals:[[0,1,0],[.5,0,.86]],//normals of the surface at the intersection points<br />
ids:["f47ac10b-58cc-4372-a567-0e02b2c3d479","a33ff133-58dd-2272-dd6a-12aadc31d173"]//object IDs for each intersected surface<br />
}<br />
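Correlating intersections responses with their raytrace requests by request ID can be sketched as a pending-callback table. makeRaytracer is an illustrative wrapper, not part of the API; it assumes the renderer delivers responses through its callback.<br />

```javascript
// Wrap a renderer's send() so each raytrace request gets a per-request
// callback, matched to its "intersections" response by request ID.
function makeRaytracer(gfx) {
  let nextId = 0;
  const pending = {};          // request id -> callback
  return {
    raytrace(pos, dir, opts, cb) {
      const id = nextId++;
      pending[id] = cb;
      gfx.send(Object.assign({ msg: "raytrace", id, pos, dir }, opts));
    },
    onMessage(m) {             // wire this into the renderer's callback
      if (m.msg === "intersections" && pending[m.id]) {
        pending[m.id](m);
        delete pending[m.id];
      }
    }
  };
}
```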
<br />
<br />
=Experimental/Brainstorming ideas for the API=<br />
I decided to reserve a section of the wiki for bleeding-edge ideas of cool features that would be nice to have. I could have put this on the "talk" page, but I think it makes more sense here so that it gets wider exposure. These are meant to be things that would help in drawing real scenes and building real VW systems but for which we haven't figured out a good API yet.<br />
<br />
==Attaching UI elements to graphics objects==<br />
The UI will naturally need to be in HTML, since that's the best-established cross-platform, sandboxed UI system.<br />
<br />
The user may specify a 3d location, orientation and scale for a UI dialog. The graphics system should do its best to scale and position the UI appropriately, but on many systems the UI may be restricted to always face the camera and stay horizontal relative to the bottom of the screen. The UI should not be displayed if it is completely invisible from the camera angle or smaller than 10 pixels.<br />
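The 10-pixel visibility rule can be sketched for a perspective camera with a vertical field of view. The projection details here are an assumption for illustration; the text above only fixes the threshold.<br />

```javascript
// Hide a UI element whose projected height falls below 10 pixels.
// worldHeight: element height in world units; distance: to the camera;
// fovY: vertical field of view in radians; viewportHeight: in pixels.
function uiVisible(worldHeight, distance, fovY, viewportHeight) {
  if (distance <= 0) return false;   // behind or at the camera plane
  const pixels =
    worldHeight / (2 * distance * Math.tan(fovY / 2)) * viewportHeight;
  return pixels >= 10;
}
```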
===Creating/Updating UI Element===<br />
{<br />
msg:"IFrame",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
uri: "http://example.com"<br />
}<br />
<br />
===Destroying UI Element===<br />
{<br />
msg:"DestroyIFrame",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
}<br />
<br />
<br />
It seems like there should be a way aside from embedded iframes to get art defined in the DOM into the scene graph--perhaps the canvas tag is the way to go here? But maybe that's too WebGL-specific and won't work for an Ogre port of this.<br />
<br />
==Attaching 3d Text to an Object==<br />
I'm just brainstorming here: it seems like WebGL has facilities to do this efficiently, but I don't have a good use case except building a rendering system inside a canvas tag or something.<br />
<br />
Perhaps the canvas tag is the way to go <br />
<br />
{<br />
msg:"Text",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
text:"This is a test of the emergency broadcast system",<br />
font:"size=+1"<br />
}<br />
<br />
{<br />
msg:"DestroyText",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
}<br />
<br />
==Particle System==<br />
===Adding a particle system to an object===<br />
This mimics the ogre interface and we introduce a number of billboard types<br />
point<br />
The default arrangement, this approximates spherical particles and the billboards always fully face the camera. <br />
oriented_common<br />
Particles are oriented around a common, typically fixed direction vector (see common_direction), which acts as their local Y axis. The billboard rotates only around this axis, giving the particle some sense of direction. Good for rainstorms, starfields etc where the particles will be traveling in one direction--this is slightly faster than oriented_self (see below). <br />
oriented_self<br />
Particles are oriented around their own direction vector, which acts as their local Y axis. As the particle changes direction, so the billboard reorients itself to face this way. Good for laser fire, fireworks and other 'streaky' particles that should look like they are traveling in their own direction. <br />
perpendicular_common<br />
Particles are perpendicular to a common, typically fixed direction vector (see common_direction), which acts as their local Z axis, and their local Y axis coplanar with the common direction and the common up vector (see common_up_vector). The billboard never rotates to face the camera; you might use a double-sided material to ensure particles are never culled when back-facing. Good for aureolas, rings etc where the particles will be perpendicular to the ground--this is slightly faster than perpendicular_self (see below). <br />
perpendicular_self<br />
Particles are perpendicular to their own direction vector, which acts as their local Z axis, and their local Y axis coplanar with their own direction vector and the common up vector (see common_up_vector). The billboard never rotates to face the camera; you might use a double-sided material to ensure particles are never culled when back-facing.<br />
For further documentation about the properties see<br />
http://www.ogre3d.org/docs/manual/manual_35.html#SEC191<br />
http://www.ogre3d.org/docs/manual/manual_36.html#SEC208<br />
<br />
<br />
{<br />
msg:"ParticleSystem",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
mesh:"http://example.com/billboard.dae",//the mesh should be rescaled to be a 1x1 mesh<br />
particle_size:[20,20],<br />
cull_each:false,<br />
quota:10000,<br />
billboard:"oriented_self",<br />
sorted:false,//defaults to false--whether the particles should be sorted<br />
local:false,//defaults to false--if true rotation of the node after the emission of the particle will rotate it<br />
direction: [0,0,1],///the common direction for oriented_common or perpendicular_common<br />
up: [0,0,1],///Only required if billboard_type is set to perpendicular_self or perpendicular_common, this vector is the common up vector used to orient all particles in the system.<br />
accurate_facing:false,//if the facing is set to the camera facing or calculated per billboard<br />
iteration_interval:.125,//how often the particles are updated--if set to 0, defaults to framerate<br />
invisibility_timeout:10//how many seconds of being outside the frustum before the system stops updating<br />
}<br />
<br />
{<br />
msg:"DestroyParticleSystem",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479"<br />
}<br />
<br />
<br />
Once a system is created, particles need to be emitted from it. There should be a global map of default emitters named ParticleEmitters consisting of at least<br />
"Point","Box","Cylinder","Ellipsoid","Shell","Ring"; the extra attributes are specified in http://www.ogre3d.org/docs/manual/manual_38.html<br />
{<br />
msg:"ParticleEmitter",//add or update a particle emitter<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
name:"Flare",<br />
type:"Ring",<br />
angle:15,<br />
emission_rate:75,<br />
time_to_live:[2.5,3],//range between 2.5 and 3<br />
direction:[0,1,0],//3d vector<br />
speed:[250,300],//range between 250 and 300<br />
colour_range:[[1,0,0],[0,0,1]],//random color<br />
position:[0,0,0],<br />
repeat_delay:[2.5,5]<br />
}<br />
<br />
{<br />
msg:"RemoveParticleEmitter",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
name:"Flare"<br />
}<br />
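Emitter fields given as [min,max] pairs (time_to_live, speed, repeat_delay) describe uniform ranges; a renderer might sample them per particle like this (sampleRange is an illustrative helper):<br />

```javascript
// Sample a [min,max] range uniformly; pass scalars through unchanged.
function sampleRange(r) {
  return Array.isArray(r) ? r[0] + Math.random() * (r[1] - r[0]) : r;
}
```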
<br />
There may be forces applied to the emitters,<br />
and there must be a global map of affectors called ParticleAffector from which the relevant affector is selected, consisting of at least<br />
<br />
LinearForce, ColourFader, Scaler, Rotator, ColourInterpolator, ColourImage, DeflectorPlane, DirectionRandomiser.<br />
The detailed definitions are contained at http://www.ogre3d.org/docs/manual/manual_40.html#SEC234<br />
<br />
{<br />
msg:"ParticleAffector",//add or update a particle affector<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
name:"TheForce",<br />
type:"LinearForce",<br />
force_vector:[0,-100,0],<br />
force_application:"add"<br />
}<br />
<br />
{<br />
msg:"RemoveParticleAffector",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
name:"TheForce"<br />
}<br />
<br />
==Specifying a terrain for the world==<br />
Would need to be chunked and in some sort of widely readable format--it would be nice to be able to tap into Google Earth for terrain. Ideas for how to do this are still very early.<br />
<br />
=Deprecated API Ideas=<br />
Here we put ideas we had but decided to discard, so that they don't come up again as new ideas; they may be discussed here and evaluated for re-addition if someone feels strongly they should be included.<br />
<br />
<br />
==Skeleton file formats==<br />
<br />
The reason these were removed is that they are too brittle (it's hard to blend a wave and a walk animation and have the steps not be half as wide) and it's difficult to keep the skeletons out of trouble (i.e. feet going through the ground),<br />
so we think that the physics system in general should send the bone positions and timestamps, since it's the arbiter of what intersects what--and it can always read the skeleton file format.<br />
<br />
===Animating a skeleton based on a time based animation===<br />
{<br />
msg:"Ani",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
time:489192048120984102,///milliseconds since 1970 that the animation should be started from (skip frames if now is later)<br />
animation:"http://example.com/animation.dae",<br />
loop:false,<br />
weight:1.0, ///how strong this animation should be compared with other animations that use the same bones<br />
fadein:2.3 //how many seconds to fade in<br />
}<br />
<br />
Note that the animation.dae should have annotations for loop-in point and loop-out point within the .dae so that loop can intelligently function<br />
===Stopping a skeleton based on a time based animation===<br />
{<br />
msg:"AniStop",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
animation:"http://example.com/animation.dae",<br />
fadeout:1.0//how many seconds to fade out<br />
}<br />
<br />
<br />
==Should an object just be a sprite==<br />
We figured that a collada square file may be a more compact representation for a sprite and can contain the appropriate shader, materials, etc.<br />
===Making an object a point sprite===<br />
This mimics the ogre interface and we introduce a number of billboard types<br />
point<br />
The default arrangement, this approximates spherical particles and the billboards always fully face the camera. <br />
oriented_self<br />
Particles are oriented around their own direction vector, which acts as their local Y axis. As the particle changes direction, so the billboard reorients itself to face this way. Good for laser fire, fireworks and other 'streaky' particles that should look like they are traveling in their own direction. <br />
perpendicular_self<br />
Particles are perpendicular to their own direction vector, which acts as their local Z axis, and their local Y axis coplanar with their own direction vector and the common up vector (see common_up_vector). The billboard never rotates to face the camera; you might use a double-sided material to ensure particles are never culled when back-facing.<br />
For further documentation about the properties see<br />
http://www.ogre3d.org/docs/manual/manual_35.html#SEC191<br />
http://www.ogre3d.org/docs/manual/manual_36.html#SEC208<br />
<br />
{<br />
msg:"Sprite",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
sprite:"http://example.com/test.jpg",//the mesh should be rescaled to fit inside a unit sphere centered at 0,0,0<br />
billboard:"perpendicular_self",<br />
sorted:false,//defaults to false--whether the particles should be sorted<br />
up: [0,0,1],///Only required if billboard_type is set to perpendicular_self, this vector is the common up vector used to orient all particles in the system.<br />
}<br />
<br />
===Removing point sprite property from object===<br />
{<br />
msg:"DestroySprite",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
}</div>Rrykhttps://www.sirikata.com/wiki/index.php?title=JavascriptGraphicsAPI&diff=660JavascriptGraphicsAPI2010-08-03T15:43:17Z<p>Rryk: /* Mouse Events */</p>
<hr />
<div>=Rationale for a common API to 3d graphics systems=<br />
<br />
Objects are sent across the thread barrier to alter the current scene graph being displayed--here's why:<br />
<br />
In modern engines, graphics framerates should not be tied to physics framerates and networking events and decoding of said events should happen at the correct pace to keep up with the networknig adapter. Graphics, however, is tied to the DOM and therefore must be on the main thread. This forces both networking and physics to be on webworker threads if there is to be any threading.<br />
<br />
These threads need to advertise state changes to the main graphics thread so that the scene graph may be altered at the graphics rate. This requires that the individual physics and networking threads send timestamped events to the graphics system which drive changes to it. <br />
<br />
Since graphics can run at different rates and the updates from the network may be irregular, the graphics (main) thread needs to have a smooth interpolation scheme interpolating the current position with the timestamped updates sent by the network/physics thread(s). The interpolation scheme should use cubic interpolation using the most current update's position and orientation along with the displayed position and location of the object when that update was received to provide a smooth scheme.<br />
<br />
Below are some example objects that may be sent cross-thread. The objects are listed in JSON format so the type information should be clear from the example.<br />
<br />
=API To Graphics System=<br />
Graphics should provide a constructor method that takes in a callback and a parent DOM element and returns a class that has a send(obj) method that takes in serializable objects from other threads that modify graphics state and an optional "destroy" method which cleans up graphics state in the DOM.<br />
<br />
so a sequence of code to construct an graphics system, make an object, and destroy it could look like<br />
gfx= new GLGERenderer(callbackFunction,parentElement)<br />
gfx.send({ msg:"Create", id:"f47ac10b-58cc-4372-a567-0e02b2c3d479", time: 2181298451298491284, pos:[1,2,3], orient:[.5,0,0,.5]})<br />
gfx.destroy();<br />
<br />
=Cross thread communication from Physics and Networking to graphics=<br />
==Object Management==<br />
id's can be anything from human readable strings to uuids to integers. They just each must be unique and chosen by the user of the API<br />
===Creating a new graphics object===<br />
<br />
{<br />
msg:"Create"<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
spaceid:"aaaaaaaa-bbbb-cccc-dada-134234ab98",//<-- optional (defaults to the empty space, 0)<br />
time: 2181298451298491284,//milliseconds since 1970<br />
pos:[1,2,3],<br />
vel:[.25,0,0],<br />
orient:[.5,0,0,.5]<br />
rotaxis:[0,0,1]<br />
rotvel:.25,<br />
scale:[1,1,1],//defaults to 1,1,1 if absent<br />
parent:"c46ac00b-58cc-4372-a567-0e02b2c3d479",//<-- optional (defaults to empty--toplevel if absent<br />
parentbone:"Hand"//name of the bone on the parent object that this is attached to. Assume root transform otherwise<br />
}<br />
<br />
===Moving a graphics object===<br />
//should we define that the graphics system has some sort of interp--otherwise velocity may be useless?<br />
<br />
Should we use "parentbone" or "attachment_point"?<br />
<br />
FIXME: We should default these values to the last position, not to identity. Otherwise we basically have to send everything each time.<br />
<br />
{<br />
msg:"Move"<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479"<br />
time:39852398592385,//milliseconds since 1970<br />
pos:[1,2,3],<br />
vel:[.25,0,0],<br />
orient:[.5,0,0,.5],//defaults to (identity) if absent<br />
rotaxis:[0,0,1],//defaults to 0,0,1 if absent, forcing rotvel to 0 <br />
rotvel:.25,//defaults to 0 if absent<br />
scale:[1,1,1],//defaults to 1,1,1 if absent<br />
interpolate:true,//set to false if the object should snap to new position<br />
parent:"c46ac00b-58cc-4372-a567-0e02b2c3d479"//<-- optional (defaults to previous state if absent, to clear pass empty string)<br />
attachment_point:"Hand"//name of the bone on the parent object that this is attached to. Defaults to previous state if absent, to clear pass empty string<br />
}<br />
<br />
===Destroying a graphics object===<br />
<br />
{<br />
msg:"Destroy"<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479"<br />
}<br />
<br />
<br />
==Managing object appearance properties==<br />
===Adding/changing mesh property for an object===<br />
{<br />
msg:"Mesh",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
type:"collada",//string that specifies file format<br />
mesh:"http://example.com/test.dae",//the mesh should be rescaled to fit inside a unit sphere centered at 0,0,0<br />
}<br />
<br />
===Updating shader property (Vertex and Fragment float4) for an object===<br />
{<br />
msg:"MeshShaderUniform"<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
name:["ColorTint","HowManyIterations"]<br />
value:[[.24,.24,.25,1.0],[1,0,0,0]]<br />
type:"float4"<br />
}<br />
<br />
===Removing mesh property for an object===<br />
{<br />
msg:"DestroyMesh",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
}<br />
<br />
===Adding/changing light property for an object===<br />
<br />
{<br />
msg:"Light"<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479"<br />
diffuse_color:[.25,.5,1],<br />
specular_color: [.2,1,.5],<br />
power=1.0: //exponent on the light<br />
ambient_color: [0,0,0],<br />
light_range: 1.0e5<br />
constant_falloff: 0.5,<br />
linear_falloff: 0.2,<br />
quadratic_falloff: 0.1,<br />
cone_inner_radians: 0,<br />
cone_outer_radians: 0,<br />
cone_falloff: 0.5,<br />
type: "POINT",//options include "SPOTLIGHT" or "DIRECTIONAL"<br />
casts_shadow: true,<br />
shader:"http://www.example.com/pointLight.shader"// light shader for ray tracing (empty by default)<br />
}<br />
<br />
===Removing light property for an object===<br />
{<br />
msg:"DestroyLight"<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479"<br />
}<br />
<br />
==Camera Management==<br />
<br />
===Creating camera properties on an object===<br />
{<br />
msg:"Camera"<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479"<br />
}<br />
<br />
===Destroying and cleaning up a camera===<br />
{<br />
msg:"DestroyCamera"<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479"<br />
}<br />
<br />
===Attach a camera to an object's texture"===<br />
{<br />
msg:AttachCamera",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479", // Camera object's id<br />
texobjid:"9a10e9c1-31fb-43e8-9a20-6545d9a62fdb", // Id of object with a mesh<br />
texname:"example.png"//overwrites this texture on the texobjid object.<br />
}<br />
===Attach a camera to a render target"===<br />
{<br />
msg:AttachCamera",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479", // Camera object's id<br />
target:0//writes to this framebuffer--- 0 for left ,1 right for stereo, etc.<br />
}<br />
===Detach a camera from its render target"===<br />
{<br />
msg:DetachCamera",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479", // Camera object's id<br />
}<br />
<br />
==Skeleton Management==<br />
<br />
===Streaming some joint locations===<br />
{<br />
msg:"AnimateBone",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
animation:"uniqueAnimationIdentifier",//so this movement can be associated with one animation and blended with others<br />
//if not specified this is a hard constraint (i.e. foot is glued to a wall in order to avoid penetrating it)<br />
weight:1.0,//the weight for prospective blending, defaults to 1.0<br />
time:1250120951209510295,//milliseconds since 1970<br />
bone:["ankle","arm"],<br />
pos:[[1,2,3],[2,3,4]],<br />
vel:[[.25,0,0],[0,0,0]],<br />
orient:[[.5,0,0,.5],[1,0,0,0]],<br />
rotaxis:[[0,0,1],[0,1,0]],<br />
rotvel:[.25,0],<br />
interpolate:true//if false then the bone should snap to the location unless smooth is set (in which case it should interpolate as quickly as possible) defaults to true<br />
}<br />
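A sketch of how the per-animation weights above could blend several AnimateBone streams targeting the same bone; blendBone and the stream shape are illustrative assumptions, not part of the message format:<br />

```javascript
// Blend the positions that several active animations request for one
// bone, weighted by each animation's `weight` field as described above.
function blendBone(streams) {
  // streams: [{weight, pos:[x,y,z]}, ...] for a single bone
  const total = streams.reduce((sum, s) => sum + s.weight, 0);
  if (total === 0) return [0, 0, 0];
  return [0, 1, 2].map(i =>
    streams.reduce((sum, s) => sum + s.weight * s.pos[i], 0) / total);
}
```

A stream with no animation field would be a hard constraint and could simply be given an overriding weight.<br />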
<br />
==Event handling==<br />
<br />
===Mouse Events===<br />
<br />
Messages <u>from the graphics system</u> for standard browser events:<br />
{<br />
msg:"mousemove",<br />
x:100,//X coordinate of the mouse, relative to canvas<br />
y:102,//Y coordinate of the mouse, relative to canvas<br />
}<br />
<br />
{<br />
msg:"mousedown",<br />
which:0,//right mouse button = 2, left mouse button = 0<br />
x:100,//X coordinate of the mouse, relative to canvas<br />
y:102,//Y coordinate of the mouse, relative to canvas<br />
}<br />
<br />
{<br />
msg:"mouseup",<br />
which:2,//right mouse button = 2, left mouse button = 0<br />
x:100,//X coordinate of the mouse, relative to canvas<br />
y:102,//Y coordinate of the mouse, relative to canvas<br />
}<br />
<br />
{<br />
msg:"mouseover",<br />
x:100,//X coordinate of the mouse, relative to canvas<br />
y:102,//Y coordinate of the mouse, relative to canvas<br />
}<br />
<br />
{<br />
msg:"mouseout",<br />
x:100,//X coordinate of the mouse, relative to canvas<br />
y:102,//Y coordinate of the mouse, relative to canvas<br />
}<br />
<br />
Despite the fact that click can be derived from mousedown and mouseup, we keep it for consistency with the Web.<br />
{<br />
msg:"click",<br />
which:0,//right mouse button = 2, left mouse button = 0<br />
x:100,//X coordinate of the mouse, relative to canvas<br />
y:102,//Y coordinate of the mouse, relative to canvas<br />
}<br />
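For illustration, deriving click from mousedown/mouseup pairs could look like this (makeClickDeriver and the pairing rule are assumptions; the API keeps click as a first-class message regardless):<br />

```javascript
// Track mousedown per button and emit a synthetic click when the
// matching mouseup for the same button arrives.
function makeClickDeriver(emit) {
  const down = {};                        // button -> {x, y} at press time
  return function onMouse(e) {
    if (e.msg === "mousedown") down[e.which] = { x: e.x, y: e.y };
    if (e.msg === "mouseup" && down[e.which] !== undefined) {
      emit({ msg: "click", which: e.which, x: e.x, y: e.y });
      delete down[e.which];
    }
  };
}
```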
<br />
As an alternative to mousemove we introduce a pick message that returns additional 3D data:<br />
{<br />
msg:"pick",<br />
x:100,//X coordinate of the mouse, relative to canvas<br />
y:102,//Y coordinate of the mouse, relative to canvas<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",//which object was hit?<br />
pos:[1,2,3],//where on the surface did it hit?<br />
normal:[.5,0,.86],//what the direction of the normal is at that point<br />
}<br />
<br />
To save traffic we allow messages <u>to the graphics system</u> that enable/disable these messages:<br />
{<br />
msg:"enable",<br />
type:"mouseover"//message type<br />
}<br />
<br />
{<br />
msg:"disable",<br />
type:"pick"//message type<br />
}<br />
<br />
All other messages, such as pickover, pickout, drag and others, can be derived from these. For example, pickover and pickout can be derived from mouseover and pick messages: when the object ID under the cursor changes, it's a pickover for the new object and a pickout for the old one. Another use case is to use mouseover and mouseout to enable/disable picking to reduce network traffic.<br />
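A sketch of that derivation for pickover/pickout, tracking the object ID under the cursor across successive pick messages (makePickTracker and the emit callback are illustrative assumptions):<br />

```javascript
// Emit pickout for the previous object and pickover for the new one
// whenever the id carried by a pick message changes.
function makePickTracker(emit) {
  let lastId = null;
  return function onPick(pick) {
    if (pick.id !== lastId) {
      if (lastId !== null) emit({ msg: "pickout", id: lastId });
      if (pick.id) emit({ msg: "pickover", id: pick.id,
                          pos: pick.pos, normal: pick.normal });
      lastId = pick.id;
    }
  };
}
```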
<br />
===Keyboard===<br />
<br />
These messages are sent <u>from the graphics system</u> in response to the user's keyboard actions:<br />
{<br />
msg: "keydown",//key was pressed, but not released<br />
altKey:true,<br />
metaKey:false,<br />
ctrlKey:true,<br />
repeat:false,<br />
shiftKey:true,<br />
which:25//see keyCode<br />
}<br />
<br />
{<br />
msg: "keyup",//key was released<br />
altKey:true,<br />
metaKey:false,<br />
ctrlKey:true,<br />
repeat:false,<br />
shiftKey:true,<br />
which:25//see keyCode<br />
}<br />
<br />
For consistency with the web we also allow keypress, despite the fact that it can be derived from keydown and keyup:<br />
{<br />
msg: "keypress",<br />
altKey:true,<br />
metaKey:false,<br />
ctrlKey:true,<br />
repeat:false,//we send repeated messages if the user holds a key; every message except the first one will have this property set to true<br />
shiftKey:true,<br />
which:25//see keyCode<br />
}<br />
<br />
===Requesting intersection===<br />
<br />
Request <u>to the graphics system</u>:<br />
{<br />
msg:"raytrace",<br />
id:5,//request ID<br />
pos:[2,3,4],//origin of the ray<br />
dir:[.24,.33,.5],//direction of the ray<br />
multiple:true,//if false, only return the first hit; otherwise return all intersections<br />
infinite:false//if false, use the length of dir to specify the ray length<br />
}<br />
<br />
Response <u>from the graphics system</u>:<br />
{<br />
msg:"intersections",<br />
id:5,//request ID<br />
pos:[[2,3,4],[2.23,3.32,4.49]],//positions of the points of intersections<br />
normals:[[0,1,0],[.5,0,.86]],//normals of the surface at intersections points<br />
ids:["f47ac10b-58cc-4372-a567-0e02b2c3d479","a33ff133-58dd-2272-dd6a-12aadc31d173"]//object IDs for each intersected surface<br />
}<br />
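A sketch of pairing these request/response messages by their request ID on the sending thread; makeRaytracer, the pending map and sendToGraphics are illustrative bookkeeping, not part of the message format:<br />

```javascript
// Correlate raytrace requests with intersections responses via the
// numeric request id carried by both messages.
function makeRaytracer(sendToGraphics) {
  let nextId = 0;
  const pending = {};                    // request id -> result callback
  return {
    raytrace(pos, dir, onResult) {
      const id = nextId++;
      pending[id] = onResult;
      sendToGraphics({ msg: "raytrace", id, pos, dir,
                       multiple: true, infinite: false });
    },
    // Feed every message received from the graphics system through here.
    onMessage(m) {
      if (m.msg === "intersections" && pending[m.id]) {
        pending[m.id](m);
        delete pending[m.id];
      }
    }
  };
}
```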
<br />
<br />
=Experimental/Brainstorming ideas for the API=<br />
I decided to reserve a section of the wiki for bleeding-edge ideas of cool features that would be nice to have. I could have put these on the "talk" page, but I think they make more sense here, where they will get wider exposure. These are meant to be things that would help in drawing real scenes and building real VW systems, but for which we haven't figured out a good API yet.<br />
<br />
==Attaching UI elements to graphics objects==<br />
The UI will naturally need to be in HTML since that's the best-established cross-platform, sandboxed UI system.<br />
<br />
The user may specify a 3d location, orientation and scale for a UI dialog. The graphics system should do its best to scale and position the UI in the appropriate place, but on many systems the UI may be restricted to always face the camera and stay horizontal relative to the bottom of the screen. The UI should not be displayed if it is completely invisible from the camera angle or smaller than 10 pixels.<br />
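The 10-pixel rule above could be checked with a rough pinhole projection (projected size is approximately worldSize / distance times the focal length in pixels); the function names and parameters here are illustrative assumptions:<br />

```javascript
// Approximate on-screen size of a UI element of the given world size
// at the given distance from the camera.
function uiPixelSize(worldSize, distance, focalPx) {
  return distance <= 0 ? Infinity : (worldSize / distance) * focalPx;
}

// Hide the element once its projected size drops below minPx (10 by
// default, per the rule above).
function uiVisible(worldSize, distance, focalPx, minPx) {
  return uiPixelSize(worldSize, distance, focalPx) >= (minPx || 10);
}
```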
===Creating/Updating UI Element===<br />
{<br />
msg:"IFrame"<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
uri: "http://example.com"<br />
}<br />
<br />
===Destroying UI Element===<br />
{<br />
msg:"DestroyIFrame"<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
}<br />
<br />
<br />
It seems like there should be a way aside from "embedded iframes" to get art defined in the DOM into the scene graph--perhaps the canvas tag is the way to go here? But maybe that's too WebGL-specific and won't work for an Ogre port of this.<br />
<br />
==Attaching 3d Text to an Object==<br />
I'm just brainstorming here: it seems like WebGL has facilities to do this efficiently, but I don't have a good use case except building a rendering system inside a canvas tag or something?<br />
<br />
Perhaps the canvas tag is the way to go <br />
<br />
{<br />
msg:"Text",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
text:"This is a test of the emergency broadcast system",<br />
font:"size=+1"<br />
}<br />
<br />
{<br />
msg:"DestroyText",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
}<br />
<br />
==Particle System==<br />
===Adding a particle system to an object===<br />
This mimics the Ogre interface, and we introduce a number of billboard types:<br />
point<br />
The default arrangement; this approximates spherical particles and the billboards always fully face the camera. <br />
oriented_common<br />
Particles are oriented around a common, typically fixed direction vector (see common_direction), which acts as their local Y axis. The billboard rotates only around this axis, giving the particle some sense of direction. Good for rainstorms, starfields etc. where the particles will be traveling in one direction - this is slightly faster than oriented_self (see below). <br />
oriented_self<br />
Particles are oriented around their own direction vector, which acts as their local Y axis. As the particle changes direction, the billboard reorients itself to face this way. Good for laser fire, fireworks and other 'streaky' particles that should look like they are traveling in their own direction. <br />
perpendicular_common<br />
Particles are perpendicular to a common, typically fixed direction vector (see common_direction), which acts as their local Z axis, with their local Y axis coplanar with the common direction and the common up vector (see common_up_vector). The billboard never rotates to face the camera; you might use a double-sided material to ensure particles are never culled by back-facing. Good for aureolas, rings etc. where the particles will be perpendicular to the ground - this is slightly faster than perpendicular_self (see below). <br />
perpendicular_self<br />
Particles are perpendicular to their own direction vector, which acts as their local Z axis, with their local Y axis coplanar with their own direction vector and the common up vector (see common_up_vector). The billboard never rotates to face the camera; you might use a double-sided material.<br />
For further documentation about the properties see<br />
http://www.ogre3d.org/docs/manual/manual_35.html#SEC191<br />
and<br />
http://www.ogre3d.org/docs/manual/manual_36.html#SEC208<br />
<br />
<br />
{ <br />
msg:"ParticleSystem",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
mesh:"http://example.com/billboard.dae",//the mesh should be rescaled to be a 1x1 mesh<br />
particle_size:[20,20],<br />
cull_each:false,<br />
quota:10000,<br />
billboard:"oriented_self",<br />
sorted:false,//defaults to false--whether the particles should be sorted<br />
local:false,//defaults to false--if true, rotation of the node after the emission of the particle will rotate it<br />
direction: [0,0,1],//the common direction for oriented_common or perpendicular_common<br />
up: [0,0,1],//only required if billboard is set to perpendicular_self or perpendicular_common; this vector is the common up vector used to orient all particles in the system<br />
accurate_facing:false,//whether the facing is taken from the camera or calculated per billboard<br />
iteration_interval:.125,//how often the particles are updated--if set to 0, defaults to framerate<br />
invisibility_timeout:10//how many seconds of being outside the frustum before the system stops updating<br />
}<br />
<br />
{<br />
msg:"DestroyParticleSystem"<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479"<br />
}<br />
<br />
<br />
Once a system is created, particles need to be emitted from it. There should be a global map of default emitters named ParticleEmitters consisting of at least<br />
"Point","Box","Cylinder","Ellipsoid","Shell" and "Ring"; the extra attributes are specified at http://www.ogre3d.org/docs/manual/manual_38.html<br />
{<br />
msg:"ParticleEmitter",//add or update a particle emitter<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
name:"Flare",<br />
type:"Ring",<br />
angle:15,<br />
emission_rate:75,<br />
time_to_live:[2.5,3],//range between 2.5 and 3<br />
direction:[0,1,0],//3d vector<br />
speed:[250,300],//range between 250 and 300<br />
colour_range:[[1,0,0],[0,0,1]],//random color between these two<br />
position:[0,0,0],<br />
repeat_delay:[2.5,5]<br />
<br />
}<br />
<br />
{<br />
msg:"RemoveParticleEmitter",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
name:"Flare"<br />
}<br />
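Several emitter attributes above take two-element [min,max] ranges (time_to_live, speed, repeat_delay). A sketch of resolving such a range per emitted particle; sampleRange is an illustrative helper, not part of the message format:<br />

```javascript
// Resolve a [min, max] range (or a plain number) to a concrete value.
// `rand` in [0,1] may be injected for deterministic testing; otherwise
// Math.random() is used.
function sampleRange(range, rand) {
  if (!Array.isArray(range)) return range;
  const r = rand === undefined ? Math.random() : rand;
  return range[0] + r * (range[1] - range[0]);
}
```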
<br />
There may be forces applied to the emitters,<br />
and there must be a global map of affectors called ParticleAffector, from which the relevant affector is selected, consisting of at least<br />
<br />
LinearForce, ColourFader, Scaler, Rotator, ColourInterpolator, ColourImage, DeflectorPlane, DirectionRandomiser.<br />
The detailed definitions are contained at http://www.ogre3d.org/docs/manual/manual_40.html#SEC234<br />
<br />
<br />
<br />
<br />
{<br />
msg:"ParticleAffector",//add or update a particle affector<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
name:"TheForce",<br />
type:"LinearForce",<br />
force_vector:[0,-100,0],<br />
force_application:"add"<br />
<br />
}<br />
<br />
{<br />
msg:"RemoveParticleAffector",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
name:"TheForce"<br />
}<br />
<br />
==Specifying a terrain for the world==<br />
This would need to be chunked and in some sort of widely readable format--it would be nice to be able to tap into Google Earth for terrain. Ideas for how to do this are still very early.<br />
<br />
=Deprecated API Ideas=<br />
Here we put ideas we had but decided to discard, so that they don't come up again as new ideas; they may be discussed here and evaluated for re-addition if someone feels strongly that they should be included.<br />
<br />
<br />
==Skeleton file formats==<br />
<br />
The reason these were removed is that they are too brittle (it's hard to weight a wave and a walk animation and have the steps not be half as wide) and it's difficult to keep the skeletons out of trouble (i.e. feet through the ground),<br />
so we think that the physics system in general should send the bone positions and timestamps, since it's the arbiter of what intersects what--and it can always read the skeleton file format.<br />
<br />
===Animating a skeleton based on a time based animation===<br />
{<br />
msg:"Ani",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
time:489192048120984102,///milliseconds since 1970 that the animation should be started from (skip frames if now is later)<br />
animation:"http://example.com/animation.dae",<br />
loop:false,<br />
weight:1.0,///how strong this animation should be compared with other animations that use the same bones<br />
fadein:2.3//how many seconds to fade in<br />
}<br />
<br />
Note that the animation.dae should have annotations for the loop-in point and loop-out point within the .dae so that loop can function intelligently.<br />
===Stopping a skeleton based on a time based animation===<br />
{<br />
msg:"AniStop",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
animation:"http://example.com/animation.dae"<br />
fadeout:1.0//how many seconds to fade out<br />
}<br />
<br />
<br />
==Should an object just be a sprite==<br />
We figured that a collada square file may be a more compact representation for a sprite and can contain the appropriate shader, materials, etc.<br />
===Making an object a point sprite===<br />
This mimics the Ogre interface, and we introduce a number of billboard types:<br />
point<br />
The default arrangement; this approximates spherical particles and the billboards always fully face the camera. <br />
oriented_self<br />
Particles are oriented around their own direction vector, which acts as their local Y axis. As the particle changes direction, the billboard reorients itself to face this way. Good for laser fire, fireworks and other 'streaky' particles that should look like they are traveling in their own direction. <br />
perpendicular_self<br />
Particles are perpendicular to their own direction vector, which acts as their local Z axis, with their local Y axis coplanar with their own direction vector and the common up vector (see common_up_vector). The billboard never rotates to face the camera; you might use a double-sided material.<br />
For further documentation about the properties see<br />
http://www.ogre3d.org/docs/manual/manual_35.html#SEC191<br />
http://www.ogre3d.org/docs/manual/manual_36.html#SEC208<br />
<br />
{<br />
msg:"Sprite",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
sprite:"http://example.com/test.jpg",//the sprite should be rescaled to fit inside a unit sphere centered at 0,0,0<br />
billboard:"perpendicular_self",<br />
sorted:false,//defaults to false--whether the particles should be sorted<br />
up: [0,0,1]//only required if billboard is set to perpendicular_self; this vector is the common up vector used to orient all particles in the system<br />
}<br />
<br />
===Removing point sprite property from object===<br />
{<br />
msg:"DestroySprite",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
}</div>Rrykhttps://www.sirikata.com/wiki/index.php?title=JavascriptGraphicsAPI&diff=659JavascriptGraphicsAPI2010-08-03T15:42:44Z<p>Rryk: /* Mouse Events */</p>
<hr />
<div>=Rationale for a common API to 3d graphics systems=<br />
<br />
Objects are sent across the thread barrier to alter the current scene graph being displayed--here's why:<br />
<br />
In modern engines, graphics framerates should not be tied to physics framerates, and networking events and the decoding of said events should happen at the correct pace to keep up with the networking adapter. Graphics, however, is tied to the DOM and therefore must be on the main thread. This forces both networking and physics to be on webworker threads if there is to be any threading.<br />
<br />
These threads need to advertise state changes to the main graphics thread so that the scene graph may be altered at the graphics rate. This requires that the individual physics and networking threads send timestamped events to the graphics system which drive changes to it. <br />
<br />
Since graphics can run at different rates and the updates from the network may be irregular, the graphics (main) thread needs a smooth interpolation scheme that blends the current position with the timestamped updates sent by the network/physics thread(s). The interpolation scheme should use cubic interpolation, using the most current update's position and orientation along with the displayed position and orientation of the object when that update was received, to provide smooth motion.<br />
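A minimal sketch of such a scheme, assuming a cubic Hermite blend over a fixed window; the function names, the blendWindowMs parameter and the state shape are illustrative assumptions, not part of any renderer's API:<br />

```javascript
// Cubic Hermite blend per component; t runs from 0 (update received)
// to 1 (fully converged on the update's prediction).
function hermite(p0, v0, p1, v1, t) {
  const h00 = 2*t*t*t - 3*t*t + 1;
  const h10 = t*t*t - 2*t*t + t;
  const h01 = -2*t*t*t + 3*t*t;
  const h11 = t*t*t - t*t;
  return p0.map((p, i) => h00*p + h10*v0[i] + h01*p1[i] + h11*v1[i]);
}

// Where should the object be drawn now? `displayed` holds the position
// and velocity shown when the update arrived (at displayed.sinceTime);
// `update` is the latest timestamped network/physics message.
function displayedPosition(displayed, update, now, blendWindowMs) {
  const dt = (now - update.time) / 1000;                 // dead reckoning
  const target = update.pos.map((p, i) => p + update.vel[i] * dt);
  const t = Math.min(1, (now - displayed.sinceTime) / blendWindowMs);
  return hermite(displayed.pos, displayed.vel, target, update.vel, t);
}
```

A real renderer would also blend orientation the same way, e.g. with quaternion slerp.<br />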
<br />
Below are some example objects that may be sent cross-thread. The objects are listed in JSON format so the type information should be clear from the example.<br />
<br />
=API To Graphics System=<br />
Graphics should provide a constructor that takes in a callback and a parent DOM element and returns an object that has a send(obj) method, which takes in serializable objects from other threads that modify graphics state, and an optional destroy() method, which cleans up graphics state in the DOM.<br />
<br />
So a sequence of code to construct a graphics system, make an object, and destroy it could look like:<br />
gfx = new GLGERenderer(callbackFunction, parentElement);<br />
gfx.send({ msg:"Create", id:"f47ac10b-58cc-4372-a567-0e02b2c3d479", time: 2181298451298491284, pos:[1,2,3], orient:[.5,0,0,.5]});<br />
gfx.destroy();<br />
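A minimal sketch of that contract, with GLGERenderer as a stand-in skeleton (only Create and Destroy are handled here; a real renderer would build scene-graph nodes and attach a canvas to parentElement):<br />

```javascript
// Renderer-side contract: constructor(callback, parentElement),
// send(obj) dispatching on obj.msg, and destroy() for cleanup.
function GLGERenderer(callback, parentElement) {
  this.callback = callback;     // used to deliver events (mouse, pick, ...)
  this.parent = parentElement;  // DOM element the canvas would attach to
  this.objects = {};            // id -> object state
}

GLGERenderer.prototype.send = function (obj) {
  switch (obj.msg) {
    case "Create":
      this.objects[obj.id] = { pos: obj.pos, orient: obj.orient };
      break;
    case "Destroy":
      delete this.objects[obj.id];
      break;
    default:
      // Unknown messages are ignored so renderers can be extended.
      break;
  }
};

GLGERenderer.prototype.destroy = function () {
  this.objects = {};            // a real renderer would also remove DOM nodes
};
```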
<br />
=Cross thread communication from Physics and Networking to graphics=<br />
==Object Management==<br />
IDs can be anything from human-readable strings to UUIDs to integers. Each just must be unique and is chosen by the user of the API.<br />
===Creating a new graphics object===<br />
<br />
{<br />
msg:"Create"<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
spaceid:"aaaaaaaa-bbbb-cccc-dada-134234ab98",//<-- optional (defaults to the empty space, 0)<br />
time: 2181298451298491284,//milliseconds since 1970<br />
pos:[1,2,3],<br />
vel:[.25,0,0],<br />
orient:[.5,0,0,.5]<br />
rotaxis:[0,0,1]<br />
rotvel:.25,<br />
scale:[1,1,1],//defaults to 1,1,1 if absent<br />
parent:"c46ac00b-58cc-4372-a567-0e02b2c3d479",//<-- optional (defaults to empty, i.e. toplevel, if absent)<br />
parentbone:"Hand"//name of the bone on the parent object that this is attached to. Assume root transform otherwise<br />
}<br />
<br />
===Moving a graphics object===<br />
//should we define that the graphics system has some sort of interp--otherwise velocity may be useless?<br />
<br />
Should we use "parentbone" or "attachment_point"?<br />
<br />
FIXME: We should default these values to the last position, not to identity. Otherwise we basically have to send everything each time.<br />
<br />
{<br />
msg:"Move"<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479"<br />
time:39852398592385,//milliseconds since 1970<br />
pos:[1,2,3],<br />
vel:[.25,0,0],<br />
orient:[.5,0,0,.5],//defaults to (identity) if absent<br />
rotaxis:[0,0,1],//defaults to 0,0,1 if absent, forcing rotvel to 0 <br />
rotvel:.25,//defaults to 0 if absent<br />
scale:[1,1,1],//defaults to 1,1,1 if absent<br />
interpolate:true,//set to false if the object should snap to new position<br />
parent:"c46ac00b-58cc-4372-a567-0e02b2c3d479"//<-- optional (defaults to previous state if absent, to clear pass empty string)<br />
attachment_point:"Hand"//name of the bone on the parent object that this is attached to. Defaults to previous state if absent, to clear pass empty string<br />
}<br />
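A sketch of the defaulting behaviour this FIXME asks for: merge a Move message into the last known state, overwriting only the fields it actually carries. applyMove and the state shape are illustrative assumptions:<br />

```javascript
// Merge a partial Move message into an object's stored state so that
// senders need not repeat the full transform every frame.
function applyMove(state, move) {
  const updatable = ["time", "pos", "vel", "orient", "rotaxis",
                     "rotvel", "scale", "parent", "attachment_point"];
  for (const key of updatable) {
    if (move[key] !== undefined) state[key] = move[key];
  }
  // Per the spec above, an empty string clears parenting.
  if (move.parent === "") delete state.parent;
  return state;
}
```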
<br />
===Destroying a graphics object===<br />
<br />
{<br />
msg:"Destroy"<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479"<br />
}<br />
<br />
<br />
==Managing object appearance properties==<br />
===Adding/changing mesh property for an object===<br />
{<br />
msg:"Mesh",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
type:"collada",//string that specifies file format<br />
mesh:"http://example.com/test.dae",//the mesh should be rescaled to fit inside a unit sphere centered at 0,0,0<br />
}<br />
<br />
===Updating shader property (Vertex and Fragment float4) for an object===<br />
{<br />
msg:"MeshShaderUniform",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
name:["ColorTint","HowManyIterations"],<br />
value:[[.24,.24,.25,1.0],[1,0,0,0]],<br />
type:"float4"<br />
}<br />
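The name and value arrays above are parallel (name[i] receives value[i]). A sketch of unpacking such a message into a per-object uniform table; applyUniforms and the store shape are illustrative assumptions:<br />

```javascript
// Record each named uniform for the target object so the renderer can
// bind them when the object's material is next drawn.
function applyUniforms(store, msg) {
  const uniforms = store[msg.id] || (store[msg.id] = {});
  msg.name.forEach((n, i) => {
    uniforms[n] = { type: msg.type, value: msg.value[i] };
  });
  return uniforms;
}
```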
<br />
===Removing mesh property for an object===<br />
{<br />
msg:"DestroyMesh",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
}<br />
<br />
{<br />
msg:"ParticleEmitter",//add or upate a particle emitter<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
name:"Flare"<br />
type:"Ring"<br />
angle 15<br />
emission_rate 75<br />
time_to_live:[2.5,3]//range between 2.5 and 3<br />
direction [0, 1, 0]//3d vector<br />
speed:[250,300]//range between 250 and 300<br />
colour_range:[[1 0 0],[0 0 1]]//random color<br />
position:[0,0,0],<br />
repeat_delay:[2.5,5]<br />
<br />
}<br />
<br />
{<br />
msg:"RemoveParticleEmitter",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
name:"Flare"<br />
}<br />
<br />
There may be forces applied to the emitters<br />
and there must be a global map of affectors called ParticleAffector from which the relevent affector is selected consisting of at least<br />
<br />
LinearForce, ColourFader, Scaler, Rotator, ColourInterpolator, ColourImage, DeflectorPlane, DirectionRandomiser<br />
The detailed definitions are contanied at http://www.ogre3d.org/docs/manual/manual_40.html#SEC234<br />
<br />
<br />
<br />
<br />
{<br />
msg:"ParticleAffector",//add or upate a particle emitter<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
name:"TheForce"<br />
type:"LinearForce"<br />
force_vector: [0 -100 0]<br />
force_application: "add"<br />
<br />
}<br />
<br />
{<br />
msg:"RemoveParticleAffector",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
name:"TheForce"<br />
}<br />
<br />
==Specifying a terrain for the world==<br />
Would need to be chunked and in some sort of widely readable format---would be nice to be able to tap into google earth for terrain--ideas for how to do this are still very very early<br />
<br />
=Deprecated API Ideas=<br />
Here we put ideas we had but decided to discard so that they don't come up again as new ideas and may be discussed here and evaluated for re-addition if someone feels strongly they should be included<br />
<br />
<br />
==Skeleton file formats==<br />
<br />
The reason these were removed is that they are too brittle (it's hard to weigh an wave and walk animation and have the steps not be half as wide) and it's difficult to keep the skeletons out of trouble (i.e. feet through the ground)<br />
so we think thta the physics system in general should send the bone positions and timestamps since it's the arbiter of what intersects what--and it can always read the skeleton file format.<br />
<br />
===Animating a skeleton based on a time based animation===<br />
{<br />
msg:"Ani",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
time:489192048120984102,///milliseconds since 1970 that the animation should be started from (skip frames if now is later)<br />
animation:"http://example.com/animation.dae",<br />
loop:false,<br />
weight:1.0 ///how strong this animation should compare with other animations that use the same bones<br />
fadein:2.3 //how many seconds to fade in<br />
}<br />
<br />
Note that the animation.dae should have annotations for loop-in point and loop-out point within the .dae so that loop can intelligently function<br />
===Stopping a skeleton based on a time based animation===<br />
{<br />
msg:"AniStop",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
animation:"http://example.com/animation.dae"<br />
fadeout:1.0//how many seconds to fade out<br />
}<br />
<br />
<br />
==Should an object just be a sprite==<br />
We figured that a collada square file may be a more compat representation for a sprite and can contain the appropriate shader, materials, etc<br />
===Making an object a point sprite===<br />
This mimics the ogre interface and we introduce a number of billboard types<br />
point<br />
The default arrangement, this approximates spherical particles and the billboards always fully face the camera. <br />
oriented_self<br />
Particles are oriented around their own direction vector, which acts as their local Y axis. As the particle changes direction, so the billboard reorients itself to face this way. Good for laser fire, fireworks and other 'streaky' particles that should look like they are traveling in their own direction. <br />
perpendicular_self<br />
Particles are perpendicular to their own direction vector, which acts as their local Z axis, and their local Y axis coplanar with their own direction vector and the common up vector (see common_up_vector). The billboard never rotates to face the camera, you might use double-side mater<br />
For further documentation about the properties see<br />
http://www.ogre3d.org/docs/manual/manual_35.html#SEC191<br />
http://www.ogre3d.org/docs/manual/manual_36.html#SEC208<br />
<br />
{<br />
msg:"Sprite",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
sprite:"http://example.com/test.jpg",//the mesh should be rescaled to fit inside a unit sphere centered at 0,0,0<br />
billboard:"perpendicular_self"<br />
sorted:false//defaults to false--whether the particles should be sorted<br />
up: [0,0,1],///Only required if billboard_type is set to perpendicular_self, this vector is the common up vector used to orient all particles in the system.<br />
}<br />
<br />
===Removing point sprite property from object===<br />
{<br />
msg:"DestroySprite",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
}</div>Rrykhttps://www.sirikata.com/wiki/index.php?title=JavascriptGraphicsAPI&diff=658JavascriptGraphicsAPI2010-08-03T15:40:54Z<p>Rryk: /* Mouse Events */</p>
<hr />
<div>=Rationale for a common API to 3d graphics systems=<br />
<br />
Objects are sent across the thread barrier to alter the current scene graph being displayed--here's why:<br />
<br />
In modern engines, graphics framerates should not be tied to physics framerates, and networking events should be decoded at whatever pace keeps up with the network adapter. Graphics, however, is tied to the DOM and therefore must run on the main thread. This forces both networking and physics onto web worker threads if there is to be any threading.<br />
<br />
These threads need to advertise state changes to the main graphics thread so that the scene graph may be altered at the graphics rate. This requires that the individual physics and networking threads send timestamped events to the graphics system which drive changes to it. <br />
<br />
Since graphics can run at a different rate and updates from the network may be irregular, the graphics (main) thread needs a smooth interpolation scheme that blends the current position with the timestamped updates sent by the network/physics thread(s). The scheme should use cubic interpolation between the object's displayed position and orientation at the moment an update was received and the position and orientation carried by that update.<br />
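The interpolation step above can be sketched as a cubic Hermite blend. This is a sketch under assumed conventions--the function name and the choice of a normalized blend window are illustrative, not part of the API:<br />

```javascript
// Cubic Hermite blend between the displayed state when an update arrived
// (p0, v0) and the update's own position/velocity (p1, v1).
// t is normalized time in [0,1] over the chosen blend window; velocities
// are assumed pre-scaled to that window. Positions are [x,y,z] arrays.
function hermite(p0, v0, p1, v1, t) {
  var t2 = t * t, t3 = t2 * t;
  var h00 = 2 * t3 - 3 * t2 + 1, h10 = t3 - 2 * t2 + t;
  var h01 = -2 * t3 + 3 * t2, h11 = t3 - t2;
  return p0.map(function (_, i) {
    return h00 * p0[i] + h10 * v0[i] + h01 * p1[i] + h11 * v1[i];
  });
}
```

At t=0 this returns the displayed position and at t=1 the update's position, so motion stays smooth even when network updates arrive irregularly. Orientation would need a quaternion variant (e.g. squad), which is omitted here.<br />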
<br />
Below are some example objects that may be sent cross-thread. The objects are listed in JSON format so the type information should be clear from the example.<br />
<br />
=API To Graphics System=<br />
Graphics should provide a constructor that takes a callback and a parent DOM element and returns an object with a send(obj) method, which accepts serializable objects from other threads that modify graphics state, and an optional destroy() method, which cleans up graphics state in the DOM.<br />
<br />
So a sequence of code to construct a graphics system, make an object, and destroy it could look like:<br />
gfx= new GLGERenderer(callbackFunction,parentElement)<br />
gfx.send({ msg:"Create", id:"f47ac10b-58cc-4372-a567-0e02b2c3d479", time: 2181298451298491284, pos:[1,2,3], orient:[.5,0,0,.5]})<br />
gfx.destroy();<br />
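A minimal shell satisfying this contract might look like the following sketch. TinyRenderer and its internal object table are illustrative names, not part of the API; a real renderer would also build scene-graph nodes:<br />

```javascript
// Skeleton renderer: dispatches cross-thread messages by their "msg" field.
function TinyRenderer(callback, parentElement) {
  this.callback = callback;   // used to post events back to other threads
  this.parent = parentElement;
  this.objects = {};          // id -> last known state
}
TinyRenderer.prototype.send = function (obj) {
  switch (obj.msg) {
    case "Create":
      this.objects[obj.id] = { pos: obj.pos, orient: obj.orient };
      break;
    case "Move":
      var o = this.objects[obj.id];
      if (o && obj.pos) o.pos = obj.pos;
      if (o && obj.orient) o.orient = obj.orient;
      break;
    case "Destroy":
      delete this.objects[obj.id];
      break;
    default:
      break; // unrecognized messages are ignored
  }
};
TinyRenderer.prototype.destroy = function () { this.objects = {}; };
```
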
<br />
=Cross thread communication from Physics and Networking to graphics=<br />
==Object Management==<br />
IDs can be anything from human-readable strings to UUIDs to integers. They must each be unique and are chosen by the user of the API.<br />
===Creating a new graphics object===<br />
<br />
{<br />
msg:"Create",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
spaceid:"aaaaaaaa-bbbb-cccc-dada-134234ab98",//<-- optional (defaults to the empty space, 0)<br />
time: 2181298451298491284,//milliseconds since 1970<br />
pos:[1,2,3],<br />
vel:[.25,0,0],<br />
orient:[.5,0,0,.5],<br />
rotaxis:[0,0,1],<br />
rotvel:.25,<br />
scale:[1,1,1],//defaults to 1,1,1 if absent<br />
parent:"c46ac00b-58cc-4372-a567-0e02b2c3d479",//<-- optional (defaults to empty--toplevel--if absent)<br />
parentbone:"Hand"//name of the bone on the parent object that this is attached to. Assume root transform otherwise<br />
}<br />
<br />
===Moving a graphics object===<br />
//should we define that the graphics system performs some sort of interpolation--otherwise velocity may be useless?<br />
<br />
Should we use "parentbone" or "attachment_point"?<br />
<br />
FIXME: We should default these values to the last position, not to identity. Otherwise we basically have to send everything each time.<br />
<br />
{<br />
msg:"Move",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
time:39852398592385,//milliseconds since 1970<br />
pos:[1,2,3],<br />
vel:[.25,0,0],<br />
orient:[.5,0,0,.5],//defaults to (identity) if absent<br />
rotaxis:[0,0,1],//defaults to 0,0,1 if absent, forcing rotvel to 0 <br />
rotvel:.25,//defaults to 0 if absent<br />
scale:[1,1,1],//defaults to 1,1,1 if absent<br />
interpolate:true,//set to false if the object should snap to new position<br />
parent:"c46ac00b-58cc-4372-a567-0e02b2c3d479"//<-- optional (defaults to previous state if absent, to clear pass empty string)<br />
attachment_point:"Hand"//name of the bone on the parent object that this is attached to. Defaults to previous state if absent, to clear pass empty string<br />
}<br />
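The "default to previous state" behavior the FIXME above asks for can be sketched as a field-by-field merge. applyMove is an illustrative name; the renderer would hold the previous state in its object table:<br />

```javascript
// Merge a Move message into an object's previous state: fields absent
// from the message keep their last known values.
function applyMove(prev, move) {
  var next = {};
  for (var k in prev) next[k] = prev[k];
  ["time", "pos", "vel", "orient", "rotaxis", "rotvel", "scale",
   "parent", "attachment_point"].forEach(function (k) {
    if (move[k] !== undefined) next[k] = move[k];
  });
  return next;
}
```
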
<br />
===Destroying a graphics object===<br />
<br />
{<br />
msg:"Destroy",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479"<br />
}<br />
<br />
<br />
==Managing object appearance properties==<br />
===Adding/changing mesh property for an object===<br />
{<br />
msg:"Mesh",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
type:"collada",//string that specifies file format<br />
mesh:"http://example.com/test.dae",//the mesh should be rescaled to fit inside a unit sphere centered at 0,0,0<br />
}<br />
<br />
===Updating shader property (Vertex and Fragment float4) for an object===<br />
{<br />
msg:"MeshShaderUniform",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
name:["ColorTint","HowManyIterations"],<br />
value:[[.24,.24,.25,1.0],[1,0,0,0]],<br />
type:"float4"<br />
}<br />
<br />
===Removing mesh property for an object===<br />
{<br />
msg:"DestroyMesh",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
}<br />
<br />
===Adding/changing light property for an object===<br />
<br />
{<br />
msg:"Light",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
diffuse_color:[.25,.5,1],<br />
specular_color: [.2,1,.5],<br />
power: 1.0,//exponent on the light<br />
ambient_color: [0,0,0],<br />
light_range: 1.0e5,<br />
constant_falloff: 0.5,<br />
linear_falloff: 0.2,<br />
quadratic_falloff: 0.1,<br />
cone_inner_radians: 0,<br />
cone_outer_radians: 0,<br />
cone_falloff: 0.5,<br />
type: "POINT",//options include "SPOTLIGHT" or "DIRECTIONAL"<br />
casts_shadow: true,<br />
shader:"http://www.example.com/pointLight.shader"// light shader for ray tracing (empty by default)<br />
}<br />
<br />
===Removing light property for an object===<br />
{<br />
msg:"DestroyLight",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479"<br />
}<br />
<br />
==Camera Management==<br />
<br />
===Creating camera properties on an object===<br />
{<br />
msg:"Camera",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479"<br />
}<br />
<br />
===Destroying and cleaning up a camera===<br />
{<br />
msg:"DestroyCamera",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479"<br />
}<br />
<br />
===Attach a camera to an object's texture===<br />
{<br />
msg:"AttachCamera",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479", // Camera object's id<br />
texobjid:"9a10e9c1-31fb-43e8-9a20-6545d9a62fdb", // Id of object with a mesh<br />
texname:"example.png"//overwrites this texture on the texobjid object.<br />
}<br />
===Attach a camera to a render target===<br />
{<br />
msg:"AttachCamera",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479", // Camera object's id<br />
target:0//writes to this framebuffer--0 for left, 1 for right in stereo, etc.<br />
}<br />
===Detach a camera from its render target===<br />
{<br />
msg:"DetachCamera",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479", // Camera object's id<br />
}<br />
<br />
==Skeleton Management==<br />
<br />
===Streaming some joint locations===<br />
{<br />
msg:"AnimateBone",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
animation:"uniqueAnimationIdentifier",//so this movement can be associated with one animation and blended with others<br />
//if not specified this is a hard constraint (i.e. foot is glued to a wall in order to avoid penetrating it)<br />
weight:1.0,//the weight for prospective blending, defaults to 1.0<br />
time:1250120951209510295,//milliseconds since 1970<br />
bone:["ankle","arm"],<br />
pos:[[1,2,3],[2,3,4]],<br />
vel:[[.25,0,0],[0,0,0]],<br />
orient:[[.5,0,0,.5],[1,0,0,0]],<br />
rotaxis:[[0,0,1],[0,1,0]],<br />
rotvel:[.25,0],<br />
interpolate:true//if false then the bone should snap to the location unless smooth is set (in which case it should interpolate as quickly as possible) defaults to true<br />
}<br />
<br />
==Event handling==<br />
<br />
===Mouse Events===<br />
<br />
Messages <u>from the graphics system</u> for standard browser events:<br />
{<br />
msg:"mousemove",<br />
x:100,//X coordinate of the mouse, relative to canvas<br />
y:102,//Y coordinate of the mouse, relative to canvas<br />
}<br />
<br />
{<br />
msg:"mousedown",<br />
which:0,//right mouse button = 2, left mouse button = 0<br />
x:100,//X coordinate of the mouse, relative to canvas<br />
y:102,//Y coordinate of the mouse, relative to canvas<br />
}<br />
<br />
{<br />
msg:"mouseup",<br />
which:2,//right mouse button = 2, left mouse button = 0<br />
x:100,//X coordinate of the mouse, relative to canvas<br />
y:102,//Y coordinate of the mouse, relative to canvas<br />
}<br />
<br />
{<br />
msg:"mouseover",<br />
x:100,//X coordinate of the mouse, relative to canvas<br />
y:102,//Y coordinate of the mouse, relative to canvas<br />
}<br />
<br />
{<br />
msg:"mouseout",<br />
x:100,//X coordinate of the mouse, relative to canvas<br />
y:102,//Y coordinate of the mouse, relative to canvas<br />
}<br />
<br />
Despite the fact that click can be derived from mousedown and mouseup, we keep it for consistency with the Web.<br />
{<br />
msg:"click",<br />
which:0,//right mouse button = 2, left mouse button = 0<br />
x:100,//X coordinate of the mouse, relative to canvas<br />
y:102,//Y coordinate of the mouse, relative to canvas<br />
}<br />
<br />
As an alternative to mousemove we introduce a pick message that returns additional 3D data:<br />
{<br />
msg:"pick",<br />
x:100,//X coordinate of the mouse, relative to canvas<br />
y:102,//Y coordinate of the mouse, relative to canvas<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",//which object was hit?<br />
pos:[1,2,3],//where on the surface did it hit?<br />
normal:[.5,0,.86],//what the direction of the normal is at that point<br />
}<br />
<br />
To save traffic we allow messages <u>to the graphics system</u> that enable/disable these messages:<br />
{<br />
msg:"enable",<br />
type:"mouseover"//message type<br />
}<br />
<br />
{<br />
msg:"disable",<br />
type:"pick"//message type<br />
}<br />
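One way the graphics system might honor these messages is to keep a set of enabled message types and consult it before posting an event across the thread boundary. EventGate is an illustrative name, not part of the API:<br />

```javascript
// Tracks which event message types are currently enabled.
function EventGate() { this.enabled = {}; }
EventGate.prototype.handle = function (m) {
  if (m.msg === "enable")  { this.enabled[m.type] = true; }
  if (m.msg === "disable") { delete this.enabled[m.type]; }
};
// The renderer calls this before posting an event of the given type.
EventGate.prototype.shouldSend = function (type) {
  return this.enabled[type] === true;
};
```
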
<br />
All other messages, such as pickover, pickout, and drag, can be derived from these. For example, pickover and pickout can be derived from the mouseover and pick messages: when the object ID under the cursor changes, it is a pickover for the new object and a pickout for the old one. Similarly, mouseover and mouseout can be used to enable and disable the pick messages.<br />
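For example, deriving pickover/pickout from the pick stream as described above can be sketched like this (the tracker function and the pickover/pickout message names are illustrative):<br />

```javascript
// Watches the pick stream; when the object ID under the cursor changes,
// emits pickout for the old object and pickover for the new one.
function makePickTracker(emit) {
  var current = null;
  return function onPick(pick) {
    if (pick.id !== current) {
      if (current !== null) emit({ msg: "pickout", id: current });
      if (pick.id) emit({ msg: "pickover", id: pick.id });
      current = pick.id || null;
    }
  };
}
```
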
<br />
===Keyboard===<br />
<br />
These messages are sent <u>from the graphics system</u> in response to the user's keyboard actions:<br />
{<br />
msg: "keydown",//key was pressed, but not released<br />
altKey:true,<br />
metaKey:false,<br />
ctrlKey:true,<br />
repeat:false,<br />
shiftKey:true,<br />
which:25//see keyCode<br />
}<br />
<br />
{<br />
msg: "keyup",//key was released<br />
altKey:true,<br />
metaKey:false,<br />
ctrlKey:true,<br />
repeat:false,<br />
shiftKey:true,<br />
which:25//see keyCode<br />
}<br />
<br />
For consistency with the web we also allow keypress, despite the fact that it can be derived from keydown and keyup:<br />
{<br />
msg: "keypress",<br />
altKey:true,<br />
metaKey:false,<br />
ctrlKey:true,<br />
repeat:false,//we send repeated message if user holds a key, every message except first one will have this property set to true<br />
shiftKey:true,<br />
which:25//see keyCode<br />
}<br />
<br />
===Requesting intersection===<br />
<br />
Request <u>to the graphics system</u>:<br />
{<br />
msg:"raytrace",<br />
id:5,//request ID<br />
pos:[2,3,4],//origin of the ray<br />
dir:[.24,.33,.5],//direction of the ray<br />
multiple:true,//if false, only return first hit, otherwise return all intersections<br />
infinite:false//if false use length of dir to specify ray length<br />
}<br />
<br />
Response <u>from the graphics system</u>:<br />
{<br />
msg:"intersections",<br />
id:5,//request ID<br />
pos:[[2,3,4],[2.23,3.32,4.49]],//positions of the points of intersection<br />
normals:[[0,1,0],[.5,0,.86]],//normals of the surface at the intersection points<br />
ids:["f47ac10b-58cc-4372-a567-0e02b2c3d479","a33ff133-58dd-2272-dd6a-12aadc31d173"]//object IDs for each intersected surface<br />
}<br />
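Pairing responses with requests by request ID might look like the following sketch. RaytraceClient and its pending table are illustrative names, not part of the API:<br />

```javascript
// Issues raytrace requests and routes each response to its callback by ID.
function RaytraceClient(gfx) {
  this.gfx = gfx;       // object with a send(obj) method
  this.nextId = 1;      // monotonically increasing request ID
  this.pending = {};    // request ID -> callback
}
RaytraceClient.prototype.raytrace = function (pos, dir, onResult) {
  var id = this.nextId++;
  this.pending[id] = onResult;
  this.gfx.send({ msg: "raytrace", id: id, pos: pos, dir: dir,
                  multiple: true, infinite: false });
};
// Called with every message coming back from the graphics system.
RaytraceClient.prototype.onMessage = function (m) {
  if (m.msg === "intersections" && this.pending[m.id]) {
    this.pending[m.id](m);
    delete this.pending[m.id];
  }
};
```
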
<br />
<br />
=Experimental/Brainstorming ideas for the API=<br />
I decided to reserve a section of the wiki for bleeding-edge ideas for cool features that would be nice to have. I could have put this on the "talk" page, but I think it makes more sense here, where it will get wider exposure. These are things that would help in drawing real scenes and building real virtual-world systems but for which we haven't figured out a good API yet.<br />
<br />
==Attaching UI elements to graphics objects==<br />
The UI will naturally need to be in HTML since that's the best established cross platform, sandboxed UI system.<br />
<br />
The user may specify a 3D location, orientation, and scale for a UI dialog. The graphics system should do its best to scale and position the UI appropriately, but on many systems the UI may be restricted to always face the camera and stay parallel to the bottom of the screen. The UI should not be displayed if it is completely invisible from the camera angle or smaller than 10 pixels.<br />
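The 10-pixel rule can be implemented by estimating the element's on-screen height. This sketch assumes a standard pinhole projection with a known vertical field of view; the function names are illustrative:<br />

```javascript
// Estimate how many pixels tall a world-space height appears at a given
// distance, for a camera with vertical FOV fovYRadians and a viewport
// viewportPx pixels tall.
function projectedPixelHeight(worldHeight, distance, fovYRadians, viewportPx) {
  var viewHeightAtD = 2 * distance * Math.tan(fovYRadians / 2);
  return (worldHeight / viewHeightAtD) * viewportPx;
}
// Hide the UI element once it drops below 10 pixels.
function uiVisible(worldHeight, distance, fovYRadians, viewportPx) {
  return projectedPixelHeight(worldHeight, distance, fovYRadians, viewportPx) >= 10;
}
```
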
===Creating/Updating UI Element===<br />
{<br />
msg:"IFrame",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
uri: "http://example.com"<br />
}<br />
<br />
===Destroying UI Element===<br />
{<br />
msg:"DestroyIFrame",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
}<br />
<br />
<br />
It seems like there should be a way, aside from embedded iframes, to get art defined in the DOM into the scene graph--perhaps the canvas tag is the way to go here? But maybe that's too WebGL-specific and won't work for an Ogre port of this.<br />
<br />
==Attaching 3d Text to an Object==<br />
I'm just brainstorming here: it seems like WebGL has facilities to do this efficiently, but I don't have a good use case except building a rendering system inside a canvas tag or something.<br />
<br />
Perhaps the canvas tag is the way to go <br />
<br />
{<br />
msg:"Text",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
text:"This is a test of the emergency broadcast system",<br />
font:"size=+1"<br />
}<br />
<br />
{<br />
msg:"DestroyText",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
}<br />
<br />
==Particle System==<br />
===Adding a particle system to an object===<br />
This mimics the Ogre interface; we introduce a number of billboard types:<br />
point<br />
The default arrangement. This approximates spherical particles and the billboards always fully face the camera.<br />
oriented_common<br />
Particles are oriented around a common, typically fixed direction vector (see common_direction), which acts as their local Y axis. The billboard rotates only around this axis, giving the particle some sense of direction. Good for rainstorms, starfields, etc. where the particles will be traveling in one direction--this is slightly faster than oriented_self (see below).<br />
oriented_self<br />
Particles are oriented around their own direction vector, which acts as their local Y axis. As the particle changes direction, the billboard reorients itself to face this way. Good for laser fire, fireworks, and other 'streaky' particles that should look like they are traveling in their own direction.<br />
perpendicular_common<br />
Particles are perpendicular to a common, typically fixed direction vector (see common_direction), which acts as their local Z axis, with their local Y axis coplanar with the common direction and the common up vector (see common_up_vector). The billboard never rotates to face the camera; you might use a double-sided material to ensure particles are never culled by back-face culling. Good for aureolas, rings, etc. where the particles will be perpendicular to the ground--this is slightly faster than perpendicular_self (see below).<br />
perpendicular_self<br />
Particles are perpendicular to their own direction vector, which acts as their local Z axis, with their local Y axis coplanar with their own direction vector and the common up vector (see common_up_vector). The billboard never rotates to face the camera; you might use a double-sided material to ensure particles are never culled by back-face culling.<br />
For further documentation about the properties see<br />
http://www.ogre3d.org/docs/manual/manual_35.html#SEC191<br />
and<br />
http://www.ogre3d.org/docs/manual/manual_36.html#SEC208<br />
<br />
<br />
{<br />
msg:"ParticleSystem",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
mesh:"http://example.com/billboard.dae",//the mesh should be rescaled to be a 1x1 mesh<br />
particle_size:[20,20],<br />
cull_each:false,<br />
quota:10000,<br />
billboard:"oriented_self",<br />
sorted:false,//defaults to false--whether the particles should be sorted<br />
local:false,//defaults to false--if true, rotation of the node after the emission of the particle will rotate it<br />
direction: [0,0,1],//the common direction for oriented_common or perpendicular_common<br />
up: [0,0,1],//only required if billboard is set to perpendicular_self or perpendicular_common; this vector is the common up vector used to orient all particles in the system<br />
accurate_facing:false,//whether facing is taken from the camera or calculated per billboard<br />
iteration_interval:.125,//how often the particles are updated--if set to 0, defaults to framerate<br />
invisibility_timeout:10//how many seconds of being outside the frustum before the system stops updating<br />
}<br />
<br />
{<br />
msg:"DestroyParticleSystem",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479"<br />
}<br />
<br />
<br />
Once a system is created, particles need to be emitted from it. There should be a global map of default emitters named ParticleEmitters consisting of at least<br />
"Point","Box","Cylinder","Ellipsoid","Shell","Ring"; the extra attributes are specified at http://www.ogre3d.org/docs/manual/manual_38.html<br />
{<br />
msg:"ParticleEmitter",//add or update a particle emitter<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
name:"Flare",<br />
type:"Ring",<br />
angle:15,<br />
emission_rate:75,<br />
time_to_live:[2.5,3],//range between 2.5 and 3<br />
direction:[0,1,0],//3d vector<br />
speed:[250,300],//range between 250 and 300<br />
colour_range:[[1,0,0],[0,0,1]],//random color<br />
position:[0,0,0],<br />
repeat_delay:[2.5,5]<br />
<br />
}<br />
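Several fields above (time_to_live, speed, repeat_delay, colour_range) are [min,max] ranges; a renderer can sample them uniformly per particle, e.g. with a helper like this (sampleRange is an illustrative name):<br />

```javascript
// Uniformly sample a [min,max] range; rnd may be injected for testing,
// otherwise Math.random() is used.
function sampleRange(range, rnd) {
  rnd = (rnd === undefined) ? Math.random() : rnd;
  return range[0] + (range[1] - range[0]) * rnd;
}
```
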
<br />
{<br />
msg:"RemoveParticleEmitter",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
name:"Flare"<br />
}<br />
<br />
There may be forces applied to the emitters, and there must be a global map of affectors called ParticleAffector from which the relevant affector is selected, consisting of at least<br />
<br />
LinearForce, ColourFader, Scaler, Rotator, ColourInterpolator, ColourImage, DeflectorPlane, and DirectionRandomiser.<br />
The detailed definitions are contained at http://www.ogre3d.org/docs/manual/manual_40.html#SEC234<br />
<br />
<br />
<br />
<br />
{<br />
msg:"ParticleAffector",//add or update a particle affector<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
name:"TheForce",<br />
type:"LinearForce",<br />
force_vector: [0,-100,0],<br />
force_application: "add"<br />
<br />
}<br />
<br />
{<br />
msg:"RemoveParticleAffector",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
name:"TheForce"<br />
}<br />
<br />
==Specifying a terrain for the world==<br />
The terrain would need to be chunked and in some widely readable format--it would be nice to be able to tap into Google Earth for terrain. Ideas for how to do this are still very early.<br />
<br />
=Deprecated API Ideas=<br />
Here we put ideas we had but decided to discard, so that they don't come up again as new ideas; they may be discussed here and evaluated for re-addition if someone feels strongly that they should be included.<br />
<br />
<br />
==Skeleton file formats==<br />
<br />
The reason these were removed is that they are too brittle (it's hard to blend a weighted wave and walk animation without the steps coming out half as wide) and it's difficult to keep the skeletons out of trouble (e.g. feet through the ground),<br />
so we think that the physics system in general should send the bone positions and timestamps, since it's the arbiter of what intersects what--and it can always read the skeleton file format.<br />
<br />
===Animating a skeleton based on a time based animation===<br />
{<br />
msg:"Ani",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
time:489192048120984102,///milliseconds since 1970 that the animation should be started from (skip frames if now is later)<br />
animation:"http://example.com/animation.dae",<br />
loop:false,<br />
weight:1.0,//how strong this animation should be compared with other animations that use the same bones<br />
fadein:2.3//how many seconds to fade in<br />
}<br />
<br />
Note that animation.dae should have annotations for the loop-in and loop-out points within the .dae so that looping can function intelligently.<br />
===Stopping a skeleton based on a time based animation===<br />
{<br />
msg:"AniStop",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
animation:"http://example.com/animation.dae",<br />
fadeout:1.0//how many seconds to fade out<br />
}<br />
<br />
<br />
==Should an object just be a sprite==<br />
We figured that a collada square file may be a more compact representation for a sprite, and it can contain the appropriate shader, materials, etc.<br />
===Making an object a point sprite===<br />
This mimics the Ogre interface; we introduce a number of billboard types:<br />
point<br />
The default arrangement. This approximates spherical particles and the billboards always fully face the camera.<br />
oriented_self<br />
Particles are oriented around their own direction vector, which acts as their local Y axis. As the particle changes direction, the billboard reorients itself to face this way. Good for laser fire, fireworks, and other 'streaky' particles that should look like they are traveling in their own direction.<br />
perpendicular_self<br />
Particles are perpendicular to their own direction vector, which acts as their local Z axis, with their local Y axis coplanar with their own direction vector and the common up vector (see common_up_vector). The billboard never rotates to face the camera; you might use a double-sided material to ensure particles are never culled by back-face culling.<br />
For further documentation about the properties see<br />
http://www.ogre3d.org/docs/manual/manual_35.html#SEC191<br />
http://www.ogre3d.org/docs/manual/manual_36.html#SEC208<br />
<br />
{<br />
msg:"Sprite",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
sprite:"http://example.com/test.jpg",//the sprite should be rescaled to fit inside a unit sphere centered at 0,0,0<br />
billboard:"perpendicular_self",<br />
sorted:false,//defaults to false--whether the particles should be sorted<br />
up: [0,0,1]//only required if billboard is set to perpendicular_self; this vector is the common up vector used to orient all particles in the system<br />
}<br />
<br />
===Removing point sprite property from object===<br />
{<br />
msg:"DestroySprite",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
}</div>Rrykhttps://www.sirikata.com/wiki/index.php?title=JavascriptGraphicsAPI&diff=657JavascriptGraphicsAPI2010-08-03T15:39:12Z<p>Rryk: /* Mouse Events */</p>
<hr />
<div>=Rationale for a common API to 3d graphics systems=<br />
<br />
Objects are sent across the thread barrier to alter the current scene graph being displayed--here's why:<br />
<br />
In modern engines, graphics framerates should not be tied to physics framerates, and networking events should be decoded at whatever pace keeps up with the network adapter. Graphics, however, is tied to the DOM and therefore must run on the main thread. This forces both networking and physics onto web worker threads if there is to be any threading.<br />
<br />
These threads need to advertise state changes to the main graphics thread so that the scene graph may be altered at the graphics rate. This requires that the individual physics and networking threads send timestamped events to the graphics system which drive changes to it. <br />
<br />
Since graphics can run at a different rate and updates from the network may be irregular, the graphics (main) thread needs a smooth interpolation scheme that blends the current position with the timestamped updates sent by the network/physics thread(s). The scheme should use cubic interpolation between the object's displayed position and orientation at the moment an update was received and the position and orientation carried by that update.<br />
<br />
Below are some example objects that may be sent cross-thread. The objects are listed in JSON format so the type information should be clear from the example.<br />
<br />
=API To Graphics System=<br />
Graphics should provide a constructor that takes a callback and a parent DOM element and returns an object with a send(obj) method, which accepts serializable objects from other threads that modify graphics state, and an optional destroy() method, which cleans up graphics state in the DOM.<br />
<br />
so a sequence of code to construct a graphics system, make an object, and destroy it could look like<br />
gfx= new GLGERenderer(callbackFunction,parentElement)<br />
gfx.send({ msg:"Create", id:"f47ac10b-58cc-4372-a567-0e02b2c3d479", time: 2181298451298491284, pos:[1,2,3], orient:[.5,0,0,.5]})<br />
gfx.destroy();<br />
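A minimal skeleton of such a renderer might look like the following (the class name and internals are illustrative only; a real GLGERenderer would dispatch into an actual scene graph):<br />

```javascript
// Illustrative skeleton of a renderer conforming to the API above: the
// constructor takes a callback and a parent DOM element, send() dispatches
// on the msg field, and destroy() cleans up. Scene-graph work is stubbed out.
function SketchRenderer(callback, parentElement) {
    this.callback = callback;        // used to send events (mouse, pick, ...) back
    this.parentElement = parentElement;
    this.objects = {};               // id -> last known state
}
SketchRenderer.prototype.send = function (obj) {
    switch (obj.msg) {
    case "Create":
        this.objects[obj.id] = obj;  // real impl: add a node to the scene graph
        break;
    case "Move":
        // real impl: record a timestamped target for interpolation
        var prev = this.objects[obj.id];
        if (prev) for (var k in obj) prev[k] = obj[k];
        break;
    case "Destroy":
        delete this.objects[obj.id]; // real impl: remove node, free resources
        break;
    default:
        // Mesh, Light, Camera, ... would be handled here
        break;
    }
};
SketchRenderer.prototype.destroy = function () {
    this.objects = {};               // real impl: tear down DOM/GL state
};
```

A caller would then drive it exactly as in the sequence above: construct, send a Create message, destroy.<br />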
<br />
=Cross thread communication from Physics and Networking to graphics=<br />
==Object Management==<br />
IDs can be anything from human-readable strings to UUIDs to integers. Each must be unique and is chosen by the user of the API.<br />
===Creating a new graphics object===<br />
<br />
{<br />
msg:"Create",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
spaceid:"aaaaaaaa-bbbb-cccc-dada-134234ab98",//<-- optional (defaults to the empty space, 0)<br />
time: 2181298451298491284,//milliseconds since 1970<br />
pos:[1,2,3],<br />
vel:[.25,0,0],<br />
orient:[.5,0,0,.5],<br />
rotaxis:[0,0,1],<br />
rotvel:.25,<br />
scale:[1,1,1],//defaults to 1,1,1 if absent<br />
parent:"c46ac00b-58cc-4372-a567-0e02b2c3d479",//<-- optional (defaults to empty--toplevel--if absent)<br />
parentbone:"Hand"//name of the bone on the parent object that this is attached to. Assume root transform otherwise<br />
}<br />
<br />
===Moving a graphics object===<br />
//should we define that the graphics system has some sort of interpolation--otherwise velocity may be useless?<br />
<br />
Should we use "parentbone" or "attachment_point"?<br />
<br />
FIXME: We should default these values to the last position, not to identity. Otherwise we basically have to send everything each time.<br />
<br />
{<br />
msg:"Move",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
time:39852398592385,//milliseconds since 1970<br />
pos:[1,2,3],<br />
vel:[.25,0,0],<br />
orient:[.5,0,0,.5],//defaults to (identity) if absent<br />
rotaxis:[0,0,1],//defaults to 0,0,1 if absent, forcing rotvel to 0 <br />
rotvel:.25,//defaults to 0 if absent<br />
scale:[1,1,1],//defaults to 1,1,1 if absent<br />
interpolate:true,//set to false if the object should snap to new position<br />
parent:"c46ac00b-58cc-4372-a567-0e02b2c3d479"//<-- optional (defaults to previous state if absent, to clear pass empty string)<br />
attachment_point:"Hand"//name of the bone on the parent object that this is attached to. Defaults to previous state if absent, to clear pass empty string<br />
}<br />
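The defaults-to-previous-state semantics discussed above can be sketched as a merge (a hypothetical helper, not part of the API itself):<br />

```javascript
// Sketch of applying a Move message to an object's previous state: absent
// fields keep their previous values, while parent/attachment_point treat
// the empty string as "clear", as described in the comments above.
function applyMove(prev, move) {
    var next = {};
    for (var k in prev) next[k] = prev[k];          // start from previous state
    var fields = ["time", "pos", "vel", "orient", "rotaxis",
                  "rotvel", "scale", "interpolate"];
    fields.forEach(function (f) {
        if (move[f] !== undefined) next[f] = move[f];
    });
    ["parent", "attachment_point"].forEach(function (f) {
        if (move[f] === "") delete next[f];          // empty string clears
        else if (move[f] !== undefined) next[f] = move[f];
    });
    return next;
}
```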
<br />
===Destroying a graphics object===<br />
<br />
{<br />
msg:"Destroy",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479"<br />
}<br />
<br />
<br />
==Managing object appearance properties==<br />
===Adding/changing mesh property for an object===<br />
{<br />
msg:"Mesh",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
type:"collada",//string that specifies file format<br />
mesh:"http://example.com/test.dae",//the mesh should be rescaled to fit inside a unit sphere centered at 0,0,0<br />
}<br />
<br />
===Updating shader property (Vertex and Fragment float4) for an object===<br />
{<br />
msg:"MeshShaderUniform",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
name:["ColorTint","HowManyIterations"],<br />
value:[[.24,.24,.25,1.0],[1,0,0,0]],<br />
type:"float4"<br />
}<br />
<br />
===Removing mesh property for an object===<br />
{<br />
msg:"DestroyMesh",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
}<br />
<br />
===Adding/changing light property for an object===<br />
<br />
{<br />
msg:"Light",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
diffuse_color:[.25,.5,1],<br />
specular_color: [.2,1,.5],<br />
power: 1.0,//exponent on the light<br />
ambient_color: [0,0,0],<br />
light_range: 1.0e5,<br />
constant_falloff: 0.5,<br />
linear_falloff: 0.2,<br />
quadratic_falloff: 0.1,<br />
cone_inner_radians: 0,<br />
cone_outer_radians: 0,<br />
cone_falloff: 0.5,<br />
type: "POINT",//options include "SPOTLIGHT" or "DIRECTIONAL"<br />
casts_shadow: true,<br />
shader:"http://www.example.com/pointLight.shader"// light shader for ray tracing (empty by default)<br />
}<br />
<br />
===Removing light property for an object===<br />
{<br />
msg:"DestroyLight",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479"<br />
}<br />
<br />
==Camera Management==<br />
<br />
===Creating camera properties on an object===<br />
{<br />
msg:"Camera",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479"<br />
}<br />
<br />
===Destroying and cleaning up a camera===<br />
{<br />
msg:"DestroyCamera",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479"<br />
}<br />
<br />
===Attach a camera to an object's texture===<br />
{<br />
msg:"AttachCamera",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479", // Camera object's id<br />
texobjid:"9a10e9c1-31fb-43e8-9a20-6545d9a62fdb", // Id of object with a mesh<br />
texname:"example.png"//overwrites this texture on the texobjid object.<br />
}<br />
===Attach a camera to a render target===<br />
{<br />
msg:"AttachCamera",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479", // Camera object's id<br />
target:0//writes to this framebuffer--- 0 for left, 1 for right in stereo, etc.<br />
}<br />
===Detach a camera from its render target===<br />
{<br />
msg:"DetachCamera",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479" // Camera object's id<br />
}<br />
<br />
==Skeleton Management==<br />
<br />
===Streaming some joint locations===<br />
{<br />
msg:"AnimateBone",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
animation:"uniqueAnimationIdentifier",//so this movement can be associated with one animation and blended with others<br />
//if not specified this is a hard constraint (i.e. foot is glued to a wall in order to avoid penetrating it)<br />
weight:1.0,//the weight for prospective blending, defaults to 1.0<br />
time:1250120951209510295,//milliseconds since 1970<br />
bone:["ankle","arm"],<br />
pos:[[1,2,3],[2,3,4]],<br />
vel:[[.25,0,0],[0,0,0]],<br />
orient:[[.5,0,0,.5],[1,0,0,0]],<br />
rotaxis:[[0,0,1],[0,1,0]],<br />
rotvel:[.25,0],<br />
interpolate:true//if false then the bone should snap to the location unless smooth is set (in which case it should interpolate as quickly as possible) defaults to true<br />
}<br />
<br />
==Event handling==<br />
<br />
===Mouse Events===<br />
<br />
Messages <u>from the graphics system</u> for standard browser events:<br />
{<br />
msg:"mousemove",<br />
x:100,//X coordinate of the mouse, relative to canvas<br />
y:102,//Y coordinate of the mouse, relative to canvas<br />
}<br />
<br />
{<br />
msg:"mousedown",<br />
which:0,//right mouse button = 2, left mouse button = 0<br />
x:100,//X coordinate of the mouse, relative to canvas<br />
y:102,//Y coordinate of the mouse, relative to canvas<br />
}<br />
<br />
{<br />
msg:"mouseup",<br />
which:2,//right mouse button = 2, left mouse button = 0<br />
x:100,//X coordinate of the mouse, relative to canvas<br />
y:102,//Y coordinate of the mouse, relative to canvas<br />
}<br />
<br />
{<br />
msg:"mouseover",<br />
x:100,//X coordinate of the mouse, relative to canvas<br />
y:102,//Y coordinate of the mouse, relative to canvas<br />
}<br />
<br />
{<br />
msg:"mouseout",<br />
x:100,//X coordinate of the mouse, relative to canvas<br />
y:102,//Y coordinate of the mouse, relative to canvas<br />
}<br />
<br />
Despite the fact that click can be derived from mousedown and mouseup, we keep it for consistency with the Web.<br />
{<br />
msg:"click",<br />
which:0,//right mouse button = 2, left mouse button = 0<br />
x:100,//X coordinate of the mouse, relative to canvas<br />
y:102,//Y coordinate of the mouse, relative to canvas<br />
}<br />
<br />
As an alternative to mousemove we have a pick message that returns additional 3D data:<br />
{<br />
msg:"pick",<br />
x:100,//X coordinate of the mouse, relative to canvas<br />
y:102,//Y coordinate of the mouse, relative to canvas<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",//which object was hit?<br />
pos:[1,2,3],//where on the surface did it hit?<br />
normal:[.5,0,.86],//the direction of the surface normal at that point<br />
}<br />
<br />
To save traffic we allow messages <u>to the graphics system</u> that enable/disable these messages:<br />
{<br />
msg:"enable",<br />
type:"mouseover"//message type<br />
}<br />
<br />
{<br />
msg:"disable",<br />
type:"pick"//message type<br />
}<br />
<br />
All other messages, such as pickover, pickout, drag, and others can be derived from these. For example, pickover and pickout can be derived from mouseover and pick messages: when the object ID under the cursor changes, it is a pickover for the new object and a pickout for the old object. The mouseover and mouseout messages can likewise be enabled and disabled.<br />
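The pickover/pickout derivation described above might be sketched like this (the tracker function is our own illustration; only the pick message itself comes from the API):<br />

```javascript
// Sketch: derive pickover/pickout events from a stream of pick messages by
// tracking the object ID under the cursor. `emit` receives derived events.
function makePickTracker(emit) {
    var currentId = null;
    return function onPick(pick) {               // pick: {msg:"pick", id, ...}
        var newId = pick.id || null;             // null when nothing is hit
        if (newId !== currentId) {
            if (currentId !== null)
                emit({msg: "pickout", id: currentId});
            if (newId !== null)
                emit({msg: "pickover", id: newId, pos: pick.pos});
            currentId = newId;
        }
    };
}
```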
<br />
===Keyboard===<br />
<br />
These messages are sent <u>from the graphics system</u> in response to the user's keyboard actions:<br />
{<br />
msg: "keydown",//key was pressed, but not released<br />
altKey:true,<br />
metaKey:false,<br />
ctrlKey:true,<br />
repeat:false,<br />
shiftKey:true,<br />
which:25//see keyCode<br />
}<br />
<br />
{<br />
msg: "keyup",//key was released<br />
altKey:true,<br />
metaKey:false,<br />
ctrlKey:true,<br />
repeat:false,<br />
shiftKey:true,<br />
which:25//see keyCode<br />
}<br />
<br />
For consistency with the web we also allow keypress, despite the fact that it can be derived from keydown and keyup:<br />
{<br />
msg: "keypress",<br />
altKey:true,<br />
metaKey:false,<br />
ctrlKey:true,<br />
repeat:false,//we send repeated message if user holds a key, every message except first one will have this property set to true<br />
shiftKey:true,<br />
which:25//see keyCode<br />
}<br />
<br />
===Requesting intersection===<br />
<br />
Request <u>to the graphics system</u>:<br />
{<br />
msg:"raytrace",<br />
id:5,//request ID<br />
pos:[2,3,4],//origin of the ray<br />
dir:[.24,.33,.5],//direction of the ray<br />
multiple:true,//if false, only return first hit; otherwise return all intersections<br />
infinite:false//if false, use the length of dir to specify the ray length<br />
}<br />
<br />
Response <u>from the graphics system</u>:<br />
{<br />
msg:"intersections",<br />
id:5,//request ID<br />
pos:[[2,3,4],[2.23,3.32,4.49]],//positions of the points of intersection<br />
normals:[[0,1,0],[.5,0,.86]],//normals of the surface at the intersection points<br />
objects:["f47ac10b-58cc-4372-a567-0e02b2c3d479","a33ff133-58dd-2272-dd6a-12aadc31d173"]//object IDs for each intersected surface<br />
}<br />
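As an illustration of this request/response pair, a raytrace handler could look like the sketch below. The sphere-based scene is purely hypothetical (a real implementation would test against the scene graph's meshes), and the object-ID list is named objects here to avoid a duplicate id key in the response:<br />

```javascript
// Sketch: answer a raytrace request against a list of bounding spheres,
// producing an "intersections"-shaped response like the example above.
// scene: [{id, center:[x,y,z], radius}] -- a stand-in for real geometry.
function handleRaytrace(req, scene) {
    var hits = [];
    var len = Math.hypot(req.dir[0], req.dir[1], req.dir[2]);
    var d = req.dir.map(function (c) { return c / len; });  // unit direction
    scene.forEach(function (s) {
        var oc = [req.pos[0] - s.center[0], req.pos[1] - s.center[1],
                  req.pos[2] - s.center[2]];
        var b = oc[0] * d[0] + oc[1] * d[1] + oc[2] * d[2];
        var c = oc[0] * oc[0] + oc[1] * oc[1] + oc[2] * oc[2] - s.radius * s.radius;
        var disc = b * b - c;
        if (disc < 0) return;                      // ray misses the sphere
        var t = -b - Math.sqrt(disc);              // distance to nearest hit
        if (t < 0) return;                         // sphere is behind the origin
        if (!req.infinite && t > len) return;      // beyond the finite ray
        hits.push({t: t, id: s.id,
                   pos: [req.pos[0] + t * d[0], req.pos[1] + t * d[1],
                         req.pos[2] + t * d[2]]});
    });
    hits.sort(function (a, b) { return a.t - b.t; });
    if (!req.multiple) hits = hits.slice(0, 1);    // first hit only
    return {msg: "intersections", id: req.id,
            pos: hits.map(function (h) { return h.pos; }),
            objects: hits.map(function (h) { return h.id; })};
}
```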
<br />
<br />
=Experimental/Brainstorming ideas for the API=<br />
I decided to reserve a section of the wiki for bleeding-edge ideas and cool features that would be nice to have. I could have put this on the "talk" page, but I think it makes more sense here so that it gets wider exposure. These are things that would help in drawing real scenes and building real VW systems, but for which we haven't figured out a good API yet.<br />
<br />
==Attaching UI elements to graphics objects==<br />
The UI will naturally need to be in HTML since that's the best established cross platform, sandboxed UI system.<br />
<br />
The user may specify a 3D location, orientation, and scale for a UI dialog. The graphics system should do its best to scale and position the UI appropriately, but on many systems the UI may be restricted to always face the camera and remain horizontal relative to the bottom of the screen. The UI should not be displayed if it is completely invisible from the camera angle or smaller than 10 pixels.<br />
===Creating/Updating UI Element===<br />
{<br />
msg:"IFrame",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
uri: "http://example.com"<br />
}<br />
<br />
===Destroying UI Element===<br />
{<br />
msg:"DestroyIFrame",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
}<br />
<br />
<br />
It seems like there should be a way, aside from embedded iframes, to get art defined in the DOM into the scene graph--perhaps the canvas tag is the way to go here? But maybe that's too WebGL-specific and won't work for an Ogre port of this.<br />
<br />
==Attaching 3d Text to an Object==<br />
I'm just brainstorming here: it seems like WebGL has facilities to do this efficiently, but I don't have a good use case except building a rendering system inside a canvas tag or something.<br />
<br />
Perhaps the canvas tag is the way to go <br />
<br />
{<br />
msg:"Text",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
text:"This is a test of the emergency broadcast system",<br />
font:"size=+1"<br />
}<br />
<br />
{<br />
msg:"DestroyText",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
}<br />
<br />
==Particle System==<br />
===Adding a particle system to an object===<br />
This mimics the Ogre interface and we introduce a number of billboard types:<br />
point<br />
The default arrangement; this approximates spherical particles, and the billboards always fully face the camera. <br />
oriented_common<br />
Particles are oriented around a common, typically fixed direction vector (see common_direction), which acts as their local Y axis. The billboard rotates only around this axis, giving the particle some sense of direction. Good for rainstorms, starfields, etc. where the particles will be traveling in one direction--this is slightly faster than oriented_self (see below). <br />
oriented_self<br />
Particles are oriented around their own direction vector, which acts as their local Y axis. As the particle changes direction, the billboard reorients itself to face this way. Good for laser fire, fireworks, and other 'streaky' particles that should look like they are traveling in their own direction. <br />
perpendicular_common<br />
Particles are perpendicular to a common, typically fixed direction vector (see common_direction), which acts as their local Z axis, with their local Y axis coplanar with the common direction and the common up vector (see common_up_vector). The billboard never rotates to face the camera; you might use a double-sided material to ensure particles are never culled by back-facing. Good for aureolas, rings, etc. where the particles will be perpendicular to the ground--this is slightly faster than perpendicular_self (see below). <br />
perpendicular_self<br />
Particles are perpendicular to their own direction vector, which acts as their local Z axis, with their local Y axis coplanar with their own direction vector and the common up vector (see common_up_vector). The billboard never rotates to face the camera; you might use a double-sided material to ensure particles are never culled by back-facing.<br />
For further documentation about the properties see<br />
http://www.ogre3d.org/docs/manual/manual_35.html#SEC191<br />
http://www.ogre3d.org/docs/manual/manual_36.html#SEC208<br />
<br />
<br />
{ <br />
msg:"ParticleSystem",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
mesh:"http://example.com/billboard.dae",//the mesh should be rescaled to be a 1x1 mesh with <br />
particle_size:[20,20],<br />
cull_each:false,<br />
quota:10000,<br />
billboard:"oriented_self",<br />
sorted:false,//defaults to false--whether the particles should be sorted<br />
local:false,//defaults to false--if true, rotation of the node after the emission of the particle will rotate it<br />
direction: [0,0,1],//the common direction for oriented_common or perpendicular_common<br />
up: [0,0,1],//Only required if billboard_type is set to perpendicular_self or perpendicular_common; this vector is the common up vector used to orient all particles in the system.<br />
accurate_facing:false,//whether the facing is set to the camera facing or calculated per billboard<br />
iteration_interval:.125,//how often the particles are updated--if set to 0, defaults to framerate<br />
invisibility_timeout:10//how many seconds of being outside the frustum before the system stops updating<br />
}<br />
<br />
{<br />
msg:"DestroyParticleSystem",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479"<br />
}<br />
<br />
<br />
Once a system is created, particles need to be emitted from it. There should be a global map of default emitters named ParticleEmitters consisting of at least<br />
"Point","Box","Cylinder","Ellipsoid","Shell","Ring" and the extra attributes are specified in http://www.ogre3d.org/docs/manual/manual_38.html<br />
{<br />
msg:"ParticleEmitter",//add or update a particle emitter<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
name:"Flare",<br />
type:"Ring",<br />
angle:15,<br />
emission_rate:75,<br />
time_to_live:[2.5,3],//range between 2.5 and 3<br />
direction:[0,1,0],//3d vector<br />
speed:[250,300],//range between 250 and 300<br />
colour_range:[[1,0,0],[0,0,1]],//random color<br />
position:[0,0,0],<br />
repeat_delay:[2.5,5]<br />
<br />
}<br />
<br />
{<br />
msg:"RemoveParticleEmitter",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
name:"Flare"<br />
}<br />
<br />
There may be forces applied to the emitters,<br />
and there must be a global map of affectors called ParticleAffector from which the relevant affector is selected, consisting of at least<br />
<br />
LinearForce, ColourFader, Scaler, Rotator, ColourInterpolator, ColourImage, DeflectorPlane, DirectionRandomiser.<br />
The detailed definitions are contained at http://www.ogre3d.org/docs/manual/manual_40.html#SEC234<br />
<br />
<br />
<br />
<br />
{<br />
msg:"ParticleAffector",//add or update a particle affector<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
name:"TheForce",<br />
type:"LinearForce",<br />
force_vector: [0,-100,0],<br />
force_application: "add"<br />
<br />
}<br />
<br />
{<br />
msg:"RemoveParticleAffector",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
name:"TheForce"<br />
}<br />
<br />
==Specifying a terrain for the world==<br />
Would need to be chunked and in some sort of widely readable format--it would be nice to be able to tap into Google Earth for terrain. Ideas for how to do this are still very early.<br />
<br />
=Deprecated API Ideas=<br />
Here we put ideas we had but decided to discard, so that they don't come up again as new ideas; they may be discussed here and evaluated for re-addition if someone feels strongly they should be included.<br />
<br />
<br />
==Skeleton file formats==<br />
<br />
The reason these were removed is that they are too brittle (it's hard to weight a wave and a walk animation together without the steps becoming half as wide), and it's difficult to keep the skeletons out of trouble (i.e. feet through the ground),<br />
so we think that the physics system in general should send the bone positions and timestamps, since it's the arbiter of what intersects what--and it can always read the skeleton file format.<br />
<br />
===Animating a skeleton based on a time based animation===<br />
{<br />
msg:"Ani",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
time:489192048120984102,//milliseconds since 1970 that the animation should be started from (skip frames if now is later)<br />
animation:"http://example.com/animation.dae",<br />
loop:false,<br />
weight:1.0,//how strong this animation should be compared with other animations that use the same bones<br />
fadein:2.3//how many seconds to fade in<br />
}<br />
<br />
Note that the animation.dae should have annotations for loop-in point and loop-out point within the .dae so that loop can intelligently function<br />
===Stopping a skeleton based on a time based animation===<br />
{<br />
msg:"AniStop",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
animation:"http://example.com/animation.dae",<br />
fadeout:1.0//how many seconds to fade out<br />
}<br />
<br />
<br />
==Should an object just be a sprite==<br />
We figured that a collada square file may be a more compact representation for a sprite and can contain the appropriate shader, materials, etc.<br />
===Making an object a point sprite===<br />
This mimics the Ogre interface and we introduce a number of billboard types:<br />
point<br />
The default arrangement; this approximates spherical particles, and the billboards always fully face the camera. <br />
oriented_self<br />
Particles are oriented around their own direction vector, which acts as their local Y axis. As the particle changes direction, the billboard reorients itself to face this way. Good for laser fire, fireworks, and other 'streaky' particles that should look like they are traveling in their own direction. <br />
perpendicular_self<br />
Particles are perpendicular to their own direction vector, which acts as their local Z axis, with their local Y axis coplanar with their own direction vector and the common up vector (see common_up_vector). The billboard never rotates to face the camera; you might use a double-sided material to ensure particles are never culled by back-facing.<br />
For further documentation about the properties see<br />
http://www.ogre3d.org/docs/manual/manual_35.html#SEC191<br />
http://www.ogre3d.org/docs/manual/manual_36.html#SEC208<br />
<br />
{<br />
msg:"Sprite",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
sprite:"http://example.com/test.jpg",//the sprite image should be rescaled to fit inside a unit sphere centered at 0,0,0<br />
billboard:"perpendicular_self",<br />
sorted:false,//defaults to false--whether the particles should be sorted<br />
up: [0,0,1]//Only required if billboard_type is set to perpendicular_self; this vector is the common up vector used to orient all particles in the system.<br />
}<br />
<br />
===Removing point sprite property from object===<br />
{<br />
msg:"DestroySprite",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
}</div>Rrykhttps://www.sirikata.com/wiki/index.php?title=JavascriptGraphicsAPI&diff=656JavascriptGraphicsAPI2010-07-27T09:18:41Z<p>Rryk: </p>
<hr />
<div>=Rationale for a common API to 3d graphics systems=<br />
<br />
Objects are sent across the thread barrier to alter the current scene graph being displayed--here's why:<br />
<br />
In modern engines, graphics framerates should not be tied to physics framerates, and networking events and the decoding of those events should happen at the correct pace to keep up with the network adapter. Graphics, however, is tied to the DOM and therefore must be on the main thread. This forces both networking and physics onto web worker threads if there is to be any threading.<br />
<br />
These threads need to advertise state changes to the main graphics thread so that the scene graph may be altered at the graphics rate. This requires that the individual physics and networking threads send timestamped events to the graphics system which drive changes to it. <br />
<br />
Since graphics can run at different rates and the updates from the network may be irregular, the graphics (main) thread needs to have a smooth interpolation scheme interpolating the current position with the timestamped updates sent by the network/physics thread(s). The interpolation scheme should use cubic interpolation using the most current update's position and orientation along with the displayed position and location of the object when that update was received to provide a smooth scheme.<br />
<br />
Below are some example objects that may be sent cross-thread. The objects are listed in JSON format so the type information should be clear from the example.<br />
<br />
=API To Graphics System=<br />
Graphics should provide a constructor method that takes in a callback and a parent DOM element and returns a class that has a send(obj) method that takes in serializable objects from other threads that modify graphics state and an optional "destroy" method which cleans up graphics state in the DOM.<br />
<br />
so a sequence of code to construct a graphics system, make an object, and destroy it could look like<br />
gfx= new GLGERenderer(callbackFunction,parentElement)<br />
gfx.send({ msg:"Create", id:"f47ac10b-58cc-4372-a567-0e02b2c3d479", time: 2181298451298491284, pos:[1,2,3], orient:[.5,0,0,.5]})<br />
gfx.destroy();<br />
<br />
=Cross thread communication from Physics and Networking to graphics=<br />
==Object Management==<br />
id's can be anything from human readable strings to uuids to integers. They just each must be unique and chosen by the user of the API<br />
===Creating a new graphics object===<br />
<br />
{<br />
msg:"Create",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
spaceid:"aaaaaaaa-bbbb-cccc-dada-134234ab98",//<-- optional (defaults to the empty space, 0)<br />
time: 2181298451298491284,//milliseconds since 1970<br />
pos:[1,2,3],<br />
vel:[.25,0,0],<br />
orient:[.5,0,0,.5],<br />
rotaxis:[0,0,1],<br />
rotvel:.25,<br />
scale:[1,1,1],//defaults to 1,1,1 if absent<br />
parent:"c46ac00b-58cc-4372-a567-0e02b2c3d479",//<-- optional (defaults to empty--toplevel--if absent)<br />
parentbone:"Hand"//name of the bone on the parent object that this is attached to. Assume root transform otherwise<br />
}<br />
<br />
===Moving a graphics object===<br />
//should we define that the graphics system has some sort of interp--otherwise velocity may be useless?<br />
<br />
Should we use "parentbone" or "attachment_point"?<br />
<br />
FIXME: We should default these values to the last position, not to identity. Otherwise we basically have to send everything each time.<br />
<br />
{<br />
msg:"Move",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
time:39852398592385,//milliseconds since 1970<br />
pos:[1,2,3],<br />
vel:[.25,0,0],<br />
orient:[.5,0,0,.5],//defaults to (identity) if absent<br />
rotaxis:[0,0,1],//defaults to 0,0,1 if absent, forcing rotvel to 0 <br />
rotvel:.25,//defaults to 0 if absent<br />
scale:[1,1,1],//defaults to 1,1,1 if absent<br />
interpolate:true,//set to false if the object should snap to new position<br />
parent:"c46ac00b-58cc-4372-a567-0e02b2c3d479"//<-- optional (defaults to previous state if absent, to clear pass empty string)<br />
attachment_point:"Hand"//name of the bone on the parent object that this is attached to. Defaults to previous state if absent, to clear pass empty string<br />
}<br />
<br />
===Destroying a graphics object===<br />
<br />
{<br />
msg:"Destroy",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479"<br />
}<br />
<br />
<br />
==Managing object appearance properties==<br />
===Adding/changing mesh property for an object===<br />
{<br />
msg:"Mesh",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
type:"collada",//string that specifies file format<br />
mesh:"http://example.com/test.dae",//the mesh should be rescaled to fit inside a unit sphere centered at 0,0,0<br />
}<br />
<br />
===Updating shader property (Vertex and Fragment float4) for an object===<br />
{<br />
msg:"MeshShaderUniform",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
name:["ColorTint","HowManyIterations"],<br />
value:[[.24,.24,.25,1.0],[1,0,0,0]],<br />
type:"float4"<br />
}<br />
<br />
===Removing mesh property for an object===<br />
{<br />
msg:"DestroyMesh",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
}<br />
<br />
===Adding/changing light property for an object===<br />
<br />
{<br />
msg:"Light",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
diffuse_color:[.25,.5,1],<br />
specular_color: [.2,1,.5],<br />
power: 1.0,//exponent on the light<br />
ambient_color: [0,0,0],<br />
light_range: 1.0e5,<br />
constant_falloff: 0.5,<br />
linear_falloff: 0.2,<br />
quadratic_falloff: 0.1,<br />
cone_inner_radians: 0,<br />
cone_outer_radians: 0,<br />
cone_falloff: 0.5,<br />
type: "POINT",//options include "SPOTLIGHT" or "DIRECTIONAL"<br />
casts_shadow: true,<br />
shader:"http://www.example.com/pointLight.shader"// light shader for ray tracing (empty by default)<br />
}<br />
<br />
===Removing light property for an object===<br />
{<br />
msg:"DestroyLight",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479"<br />
}<br />
<br />
==Camera Management==<br />
<br />
===Creating camera properties on an object===<br />
{<br />
msg:"Camera",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479"<br />
}<br />
<br />
===Destroying and cleaning up a camera===<br />
{<br />
msg:"DestroyCamera",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479"<br />
}<br />
<br />
===Attach a camera to an object's texture===<br />
{<br />
msg:"AttachCamera",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479", // Camera object's id<br />
texobjid:"9a10e9c1-31fb-43e8-9a20-6545d9a62fdb", // Id of object with a mesh<br />
texname:"example.png"//overwrites this texture on the texobjid object.<br />
}<br />
===Attach a camera to a render target===<br />
{<br />
msg:"AttachCamera",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479", // Camera object's id<br />
target:0//writes to this framebuffer--- 0 for left, 1 for right in stereo, etc.<br />
}<br />
===Detach a camera from its render target===<br />
{<br />
msg:"DetachCamera",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479" // Camera object's id<br />
}<br />
<br />
==Skeleton Management==<br />
<br />
===Streaming some joint locations===<br />
{<br />
msg:"AnimateBone",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
animation:"uniqueAnimationIdentifier",//so this movement can be associated with one animation and blended with others<br />
//if not specified this is a hard constraint (i.e. foot is glued to a wall in order to avoid penetrating it)<br />
weight:1.0,//the weight for prospective blending, defaults to 1.0<br />
time:1250120951209510295,//milliseconds since 1970<br />
bone:["ankle","arm"],<br />
pos:[[1,2,3],[2,3,4]],<br />
vel:[[.25,0,0],[0,0,0]],<br />
orient:[[.5,0,0,.5],[1,0,0,0]],<br />
rotaxis:[[0,0,1],[0,1,0]],<br />
rotvel:[.25,0],<br />
interpolate:true//if false then the bone should snap to the location unless smooth is set (in which case it should interpolate as quickly as possible) defaults to true<br />
}<br />
<br />
==Event handling==<br />
<br />
===Mouse Events===<br />
<br />
Messages <u>from the graphics system</u> for standard browser events:<br />
{<br />
msg:"mousemove",<br />
x:100,//X coordinate of the mouse, relative to canvas<br />
y:102,//Y coordinate of the mouse, relative to canvas<br />
}<br />
<br />
{<br />
msg:"mousedown",<br />
which:0,//right mouse button = 2, left mouse button = 0<br />
x:100,//X coordinate of the mouse, relative to canvas<br />
y:102,//Y coordinate of the mouse, relative to canvas<br />
}<br />
<br />
{<br />
msg:"mouseup",<br />
which:2,//right mouse button = 2, left mouse button = 0<br />
x:100,//X coordinate of the mouse, relative to canvas<br />
y:102,//Y coordinate of the mouse, relative to canvas<br />
}<br />
<br />
{<br />
msg:"mouseover",<br />
x:100,//X coordinate of the mouse, relative to canvas<br />
y:102,//Y coordinate of the mouse, relative to canvas<br />
}<br />
<br />
{<br />
msg:"mouseout",<br />
x:100,//X coordinate of the mouse, relative to canvas<br />
y:102,//Y coordinate of the mouse, relative to canvas<br />
}<br />
<br />
Despite the fact that click can be derived from mousedown and mouseup, we keep it for consistency with the Web.<br />
{<br />
msg:"click",<br />
which:0,//right mouse button = 2, left mouse button = 0<br />
x:100,//X coordinate of the mouse, relative to canvas<br />
y:102,//Y coordinate of the mouse, relative to canvas<br />
}<br />
<br />
As an alternative to mousemove we have a pick message that returns additional 3D data:<br />
{<br />
msg:"pick",<br />
x:100,//X coordinate of the mouse, relative to canvas<br />
y:102,//Y coordinate of the mouse, relative to canvas<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",//which object was hit?<br />
pos:[1,2,3],//where on the surface did it hit?<br />
normal:[.5,0,.86],//what the direction of the normal is at that point<br />
}<br />
<br />
To save traffic we allow messages <u>to the graphics system</u> that enable/disable these messages:<br />
{<br />
msg:"enable",<br />
type:"mouseover"//message type<br />
}<br />
<br />
{<br />
msg:"disable",<br />
type:"pick"//message type<br />
}<br />
<br />
All other messages, such as pickover, pickout, drag and others, can be derived from these. For example, pickover and pickout can be derived from mouseover and pick messages: when the object ID under the cursor changes, that is a pickover for the new object and a pickout for the old one. Likewise, mouseover and mouseout can be used to enable and disable these messages.<br />
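The pickover/pickout derivation described above can be sketched in a few lines of JavaScript (makePickTracker and emit are illustrative names, not part of this API):<br />

```javascript
// Track which object the cursor was last over; when the id in a "pick"
// message changes, synthesize a "pickout" for the old object and a
// "pickover" for the new one. All names here are illustrative.
function makePickTracker(emit) {
  let lastId = null;
  return function onPick(msg) {        // msg is a "pick" message
    if (msg.id !== lastId) {
      if (lastId !== null) emit({ msg: "pickout", id: lastId });
      if (msg.id) emit({ msg: "pickover", id: msg.id, x: msg.x, y: msg.y });
      lastId = msg.id;
    }
  };
}
```

Repeated pick messages for the same object produce no derived events; only ID transitions do.<br />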
<br />
===Keyboard===<br />
<br />
These messages are sent <u>from the graphics system</u> in response to the user's keyboard input:<br />
{<br />
msg: "keydown"//key was pressed, but not released<br />
altKey:true,<br />
metaKey:false,<br />
ctrlKey:true,<br />
repeat:false<br />
shiftKey:true<br />
which:25//see keyCode<br />
}<br />
<br />
{<br />
msg: "keyup",//key was released<br />
altKey:true,<br />
metaKey:false,<br />
ctrlKey:true,<br />
repeat:false<br />
shiftKey:true<br />
which:25//see keyCode<br />
}<br />
<br />
For consistency with the web we also allow keypress, despite the fact that it can be derived from keydown and keyup:<br />
{<br />
msg: "keypress",<br />
altKey:true,<br />
metaKey:false,<br />
ctrlKey:true,<br />
repeat:false,//we send repeated message if user holds a key, every message except first one will have this property set to true<br />
shiftKey:true,<br />
which:25//see keyCode<br />
}<br />
<br />
===Requesting intersection===<br />
<br />
Request <u>to the graphics system</u>:<br />
{<br />
msg:"raytrace",<br />
id:5,//request ID<br />
pos:[2,3,4],//origin of the ray<br />
dir:[.24,.33,.5],//direction of the ray<br />
multiple:true//if false, only return first hit, otherwise return all intersections<br />
infinite:false//if false use length of dir to specify ray length<br />
}<br />
<br />
Response <u>from the graphics system</u>:<br />
{<br />
msg:"intersections",<br />
id:5,//request ID<br />
pos:[[2,3,4],[2.23,3.32,4.49]],//positions of the points of intersection<br />
normals:[[0,1,0],[.5,0,.86]],//normals of the surface at the intersection points<br />
ids:["f47ac10b-58cc-4372-a567-0e02b2c3d479","a33ff133-58dd-2272-dd6a-12aadc31d173"]//object IDs for each intersected surface<br />
}<br />
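Callers can match these responses to outstanding requests through the shared request ID. A minimal sketch of that correlation (the pending-map helper below is an assumption, not part of this spec):<br />

```javascript
// Match "intersections" responses back to the "raytrace" requests that
// produced them, keyed on the shared request id. Illustrative only.
function makeRayTracer(send) {
  const pending = new Map();           // request id -> callback
  let nextId = 0;
  return {
    raytrace(pos, dir, opts, cb) {
      const id = nextId++;
      pending.set(id, cb);
      send(Object.assign({ msg: "raytrace", id: id, pos: pos, dir: dir }, opts));
      return id;
    },
    onMessage(m) {                     // feed "intersections" messages here
      if (m.msg === "intersections" && pending.has(m.id)) {
        const cb = pending.get(m.id);
        pending.delete(m.id);
        cb(m);
      }
    }
  };
}
```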
<br />
<br />
=Experimental/Brainstorming ideas for the API=<br />
I decided to reserve a section of the wiki for sort of bleeding edge ideas of cool features that would be nice to have. I could have put that in the "talk" page, but I think it makes more sense here so that it will get wider exposure. These are meant to be things that would help in drawing real scenes and building real VW systems but that we haven't figured out a good API to yet.<br />
<br />
==Attaching UI elements to graphics objects==<br />
The UI will naturally need to be in HTML since that's the best established cross platform, sandboxed UI system.<br />
<br />
The user may specify a 3d location, orientation and scale at which a UI dialog should appear. The graphics system should do its best to scale and position the UI in the appropriate place, but on many systems the UI may be restricted to always face the camera and stay horizontal relative to the bottom of the screen. The UI should not be displayed if it is completely invisible from the camera angle or smaller than 10 pixels.<br />
===Creating/Updating UI Element===<br />
{<br />
msg:"IFrame"<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
uri: "http://example.com"<br />
}<br />
<br />
===Destroying UI Element===<br />
{<br />
msg:"DestroyIFrame"<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
}<br />
<br />
<br />
It seems like there should be a manner aside from "embedded iframes" to get art defined in the DOM into the scene graph--perhaps the canvas tag is the way to go here? But maybe that's too WebGL-specific and won't work for an Ogre port of this.<br />
<br />
==Attaching 3d Text to an Object==<br />
I'm just brainstorming here: it seems like WebGL has facilities to do this efficiently, but I don't have a good use case except building a rendering system inside a canvas tag or something.<br />
<br />
Perhaps the canvas tag is the way to go <br />
<br />
{<br />
msg:"Text",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
text:"This is a test of the emergency broadcast system",<br />
font:"size=+1"<br />
}<br />
<br />
{<br />
msg:"DestroyText",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
}<br />
<br />
==Particle System==<br />
===Adding a particle system to an object===<br />
This mimics the Ogre interface, and we introduce a number of billboard types:<br />
;point<br />
:The default arrangement; this approximates spherical particles, and the billboards always fully face the camera.<br />
;oriented_common<br />
:Particles are oriented around a common, typically fixed direction vector (see common_direction), which acts as their local Y axis. The billboard rotates only around this axis, giving the particle some sense of direction. Good for rainstorms, starfields, etc. where the particles will be traveling in one direction - this is slightly faster than oriented_self (see below).<br />
;oriented_self<br />
:Particles are oriented around their own direction vector, which acts as their local Y axis. As the particle changes direction, the billboard reorients itself to face that way. Good for laser fire, fireworks and other 'streaky' particles that should look like they are traveling in their own direction.<br />
;perpendicular_common<br />
:Particles are perpendicular to a common, typically fixed direction vector (see common_direction), which acts as their local Z axis, with their local Y axis coplanar with the common direction and the common up vector (see common_up_vector). The billboard never rotates to face the camera; you might use a double-sided material to ensure particles are never culled by back-face culling. Good for aureolas, rings, etc. where the particles will be perpendicular to the ground - this is slightly faster than perpendicular_self (see below).<br />
;perpendicular_self<br />
:Particles are perpendicular to their own direction vector, which acts as their local Z axis, with their local Y axis coplanar with their own direction vector and the common up vector (see common_up_vector). The billboard never rotates to face the camera; you might use a double-sided material to ensure particles are never culled by back-face culling.<br />
For further documentation about the properties, see<br />
http://www.ogre3d.org/docs/manual/manual_35.html#SEC191<br />
http://www.ogre3d.org/docs/manual/manual_36.html#SEC208<br />
<br />
<br />
{ <br />
msg:"ParticleSystem",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479"<br />
mesh:"http://example.com/billboard.dae"//the mesh should be rescaled to be a 1x1 mesh with <br />
particle_size:[20,20],<br />
cull_each:false<br />
quota:10000<br />
billboard:"oriented_self",<br />
sorted:false//defaults to false--whether the particles should be sorted<br />
local:false//defaults to false--if true rotation of the node after the emission of the particle will rotate it<br />
direction: [0,0,1],///the common direction for oriented_common or perpendicular_common<br />
up: [0,0,1],///Only required if billboard_type is set to perpendicular_self or perpendicular_common, this vector is the common up vector used to orient all particles in the system.<br />
accurate_facing:false//if the facing is set to the camera facing or calculated per billboard<br />
iteration_interval:.125//how often the particles are updated--if set to 0, defaults to framerate<br />
invisibility_timeout:10//how many seconds of being outside the frustum before the system stops updating<br />
}<br />
<br />
{<br />
msg:"DestroyParticleSystem"<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479"<br />
}<br />
<br />
<br />
Once a system is created, particles need to be emitted from it. There should be a global map of default emitters named ParticleEmitters consisting of at least<br />
"Point", "Box", "Cylinder", "Ellipsoid", "Shell" and "Ring"; the extra attributes are specified in http://www.ogre3d.org/docs/manual/manual_38.html<br />
{<br />
msg:"ParticleEmitter",//add or update a particle emitter<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
name:"Flare"<br />
type:"Ring"<br />
angle:15,<br />
emission_rate:75,<br />
time_to_live:[2.5,3],//range between 2.5 and 3<br />
direction:[0,1,0],//3d vector<br />
speed:[250,300],//range between 250 and 300<br />
colour_range:[[1,0,0],[0,0,1]],//random color<br />
position:[0,0,0],<br />
repeat_delay:[2.5,5]<br />
<br />
}<br />
<br />
{<br />
msg:"RemoveParticleEmitter",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
name:"Flare"<br />
}<br />
<br />
There may be forces applied to the emitters<br />
and there must be a global map of affectors called ParticleAffector, from which the relevant affector is selected, consisting of at least<br />
<br />
LinearForce, ColourFader, Scaler, Rotator, ColourInterpolator, ColourImage, DeflectorPlane, DirectionRandomiser<br />
The detailed definitions are contained at http://www.ogre3d.org/docs/manual/manual_40.html#SEC234<br />
<br />
<br />
<br />
<br />
{<br />
msg:"ParticleAffector",//add or update a particle affector<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
name:"TheForce"<br />
type:"LinearForce"<br />
force_vector: [0,-100,0],<br />
force_application: "add"<br />
<br />
}<br />
<br />
{<br />
msg:"RemoveParticleAffector",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
name:"TheForce"<br />
}<br />
<br />
==Specifying a terrain for the world==<br />
Would need to be chunked and in some sort of widely readable format--it would be nice to be able to tap into Google Earth for terrain--ideas for how to do this are still very, very early.<br />
<br />
=Deprecated API Ideas=<br />
Here we put ideas we had but decided to discard so that they don't come up again as new ideas; they may be discussed here and evaluated for re-addition if someone feels strongly they should be included.<br />
<br />
<br />
==Skeleton file formats==<br />
<br />
The reason these were removed is that they are too brittle (it's hard to weight a wave and a walk animation together and have the steps not come out half as wide) and it's difficult to keep the skeletons out of trouble (i.e. feet through the ground),<br />
so we think that the physics system in general should send the bone positions and timestamps, since it's the arbiter of what intersects what--and it can always read the skeleton file format.<br />
<br />
===Animating a skeleton based on a time based animation===<br />
{<br />
msg:"Ani",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
time:489192048120984102,///milliseconds since 1970 that the animation should be started from (skip frames if now is later)<br />
animation:"http://example.com/animation.dae",<br />
loop:false,<br />
weight:1.0 ///how strong this animation should compare with other animations that use the same bones<br />
fadein:2.3 //how many seconds to fade in<br />
}<br />
<br />
Note that the animation.dae should have annotations for loop-in point and loop-out point within the .dae so that loop can intelligently function<br />
===Stopping a skeleton based on a time based animation===<br />
{<br />
msg:"AniStop",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
animation:"http://example.com/animation.dae"<br />
fadeout:1.0//how many seconds to fade out<br />
}<br />
<br />
<br />
==Should an object just be a sprite==<br />
We figured that a collada square file may be a more compact representation for a sprite and can contain the appropriate shader, materials, etc.<br />
===Making an object a point sprite===<br />
This mimics the Ogre interface, and we introduce a number of billboard types:<br />
;point<br />
:The default arrangement; this approximates spherical particles, and the billboards always fully face the camera.<br />
;oriented_self<br />
:Particles are oriented around their own direction vector, which acts as their local Y axis. As the particle changes direction, the billboard reorients itself to face that way. Good for laser fire, fireworks and other 'streaky' particles that should look like they are traveling in their own direction.<br />
;perpendicular_self<br />
:Particles are perpendicular to their own direction vector, which acts as their local Z axis, with their local Y axis coplanar with their own direction vector and the common up vector (see common_up_vector). The billboard never rotates to face the camera; you might use a double-sided material to ensure particles are never culled by back-face culling.<br />
For further documentation about the properties see<br />
http://www.ogre3d.org/docs/manual/manual_35.html#SEC191<br />
http://www.ogre3d.org/docs/manual/manual_36.html#SEC208<br />
<br />
{<br />
msg:"Sprite",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
sprite:"http://example.com/test.jpg",//the mesh should be rescaled to fit inside a unit sphere centered at 0,0,0<br />
billboard:"perpendicular_self"<br />
sorted:false//defaults to false--whether the particles should be sorted<br />
up: [0,0,1],///Only required if billboard_type is set to perpendicular_self, this vector is the common up vector used to orient all particles in the system.<br />
}<br />
<br />
===Removing point sprite property from object===<br />
{<br />
msg:"DestroySprite",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
}</div>Rrykhttps://www.sirikata.com/wiki/index.php?title=JavascriptGraphicsAPI&diff=655JavascriptGraphicsAPI2010-07-27T08:42:44Z<p>Rryk: Undo revision 651 by AmandaCusack (Talk), spam</p>
<hr />
<div>=Rationale for a common API to 3d graphics systems=<br />
<br />
Objects are sent across the thread barrier to alter the current scene graph being displayed--here's why:<br />
<br />
In modern engines, graphics framerates should not be tied to physics framerates, and networking events and decoding of those events should happen at the correct pace to keep up with the network adapter. Graphics, however, is tied to the DOM and therefore must be on the main thread. This forces both networking and physics onto web worker threads if there is to be any threading.<br />
<br />
These threads need to advertise state changes to the main graphics thread so that the scene graph may be altered at the graphics rate. This requires that the individual physics and networking threads send timestamped events to the graphics system which drive changes to it. <br />
<br />
Since graphics can run at different rates and the updates from the network may be irregular, the graphics (main) thread needs a smooth interpolation scheme, interpolating the current position with the timestamped updates sent by the network/physics thread(s). The interpolation scheme should use cubic interpolation between the most recent update's position and orientation and the displayed position and orientation of the object at the time that update was received, to provide smooth motion.<br />
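As a sketch of that scheme, here is a cubic Hermite blend for one scalar component; p0/v0 are the displayed position/velocity when the update arrived at time t0, and p1/v1 are the update's values at time t1 (the exact blend function is not fixed by this document):<br />

```javascript
// One-dimensional cubic Hermite blend between the pose that was on
// screen when an update arrived (p0, v0 at time t0) and the update's
// own pose (p1, v1 at time t1). Apply per component for positions;
// orientations would use a quaternion analogue. Illustrative only.
function hermite(p0, v0, p1, v1, t0, t1, now) {
  const h = t1 - t0;
  const t = Math.min(Math.max((now - t0) / h, 0), 1);  // clamp to [0,1]
  const t2 = t * t, t3 = t2 * t;
  return (2 * t3 - 3 * t2 + 1) * p0
       + (t3 - 2 * t2 + t) * h * v0
       + (-2 * t3 + 3 * t2) * p1
       + (t3 - t2) * h * v1;
}
```

The clamp means the object holds its last target pose if no newer update arrives, rather than extrapolating indefinitely.<br />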
<br />
Below are some example objects that may be sent cross-thread. The objects are listed in JSON format so the type information should be clear from the example.<br />
<br />
=API To Graphics System=<br />
Graphics should provide a constructor method that takes a callback and a parent DOM element and returns an object with a send(obj) method, which accepts serializable objects from other threads that modify graphics state, and an optional destroy() method, which cleans up graphics state in the DOM.<br />
<br />
so a sequence of code to construct a graphics system, make an object, and destroy it could look like<br />
gfx = new GLGERenderer(callbackFunction, parentElement);<br />
gfx.send({ msg:"Create", id:"f47ac10b-58cc-4372-a567-0e02b2c3d479", time: 2181298451298491284, pos:[1,2,3], orient:[.5,0,0,.5]});<br />
gfx.destroy();<br />
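A stub that honors this constructor/send/destroy contract might look like the following; the real GLGERenderer internals are not specified here, so everything in the body below is purely illustrative:<br />

```javascript
// Minimal skeleton honoring the constructor/send/destroy contract.
// A real renderer would build a scene graph; this stub just tracks
// objects by id so the message flow is visible.
function StubRenderer(callback, parentElement) {
  this.callback = callback;        // receives messages *from* graphics
  this.parent = parentElement;     // DOM node the canvas would live in
  this.objects = {};               // id -> last known state
}
StubRenderer.prototype.send = function (obj) {
  if (obj.msg === "Create") this.objects[obj.id] = obj;
  else if (obj.msg === "Destroy") delete this.objects[obj.id];
  // other message types ("Move", "Mesh", "Light", ...) go here
};
StubRenderer.prototype.destroy = function () {
  this.objects = {};               // a real impl would tear down DOM state
};
```

send accepts the same serializable message objects shown throughout this page, so the stub can be swapped for a real renderer without changing callers.<br />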
<br />
=Cross thread communication from Physics and Networking to graphics=<br />
==Object Management==<br />
IDs can be anything from human-readable strings to UUIDs to integers. They must each be unique and are chosen by the user of the API.<br />
===Creating a new graphics object===<br />
<br />
{<br />
msg:"Create"<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
spaceid:"aaaaaaaa-bbbb-cccc-dada-134234ab98",//<-- optional (defaults to the empty space, 0)<br />
time: 2181298451298491284,//milliseconds since 1970<br />
pos:[1,2,3],<br />
vel:[.25,0,0],<br />
orient:[.5,0,0,.5]<br />
rotaxis:[0,0,1]<br />
rotvel:.25,<br />
scale:[1,1,1],//defaults to 1,1,1 if absent<br />
parent:"c46ac00b-58cc-4372-a567-0e02b2c3d479",//<-- optional (defaults to empty, i.e. top-level, if absent)<br />
parentbone:"Hand"//name of the bone on the parent object that this is attached to. Assume root transform otherwise<br />
}<br />
<br />
===Moving a graphics object===<br />
//should we define that the graphics system has some sort of interp--otherwise velocity may be useless?<br />
<br />
Should we use "parentbone" or "attachment_point"?<br />
<br />
FIXME: We should default these values to the last position, not to identity. Otherwise we basically have to send everything each time.<br />
<br />
{<br />
msg:"Move"<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479"<br />
time:39852398592385,//milliseconds since 1970<br />
pos:[1,2,3],<br />
vel:[.25,0,0],<br />
orient:[.5,0,0,.5],//defaults to (identity) if absent<br />
rotaxis:[0,0,1],//defaults to 0,0,1 if absent, forcing rotvel to 0 <br />
rotvel:.25,//defaults to 0 if absent<br />
scale:[1,1,1],//defaults to 1,1,1 if absent<br />
interpolate:true,//set to false if the object should snap to new position<br />
parent:"c46ac00b-58cc-4372-a567-0e02b2c3d479"//<-- optional (defaults to previous state if absent, to clear pass empty string)<br />
attachment_point:"Hand"//name of the bone on the parent object that this is attached to. Defaults to previous state if absent, to clear pass empty string<br />
}<br />
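The defaulting behavior the FIXME asks for amounts to merging a partial Move message over the object's stored state. A sketch, with field names taken from the message above and the merge logic assumed:<br />

```javascript
// Merge a partial "Move" message over an object's stored state so that
// absent fields keep their previous values, as the FIXME above suggests.
// An empty string explicitly clears parent/attachment_point.
function applyMove(state, move) {
  const next = Object.assign({}, state);
  for (const k of ["time", "pos", "vel", "orient", "rotaxis",
                   "rotvel", "scale", "interpolate"]) {
    if (move[k] !== undefined) next[k] = move[k];
  }
  for (const k of ["parent", "attachment_point"]) {
    if (move[k] === "") delete next[k];        // explicit clear
    else if (move[k] !== undefined) next[k] = move[k];
  }
  return next;
}
```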
<br />
===Destroying a graphics object===<br />
<br />
{<br />
msg:"Destroy"<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479"<br />
}<br />
<br />
<br />
==Managing object appearance properties==<br />
===Adding/changing mesh property for an object===<br />
{<br />
msg:"Mesh",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
mesh:"http://example.com/test.dae",//the mesh should be rescaled to fit inside a unit sphere centered at 0,0,0<br />
}<br />
<br />
===Updating shader property (Vertex and Fragment float4) for an object===<br />
{<br />
msg:"MeshShaderUniform"<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
name:["ColorTint","HowManyIterations"]<br />
value:[[.24,.24,.25,1.0],[1,0,0,0]]<br />
type:"float4"<br />
}<br />
<br />
===Removing mesh property for an object===<br />
{<br />
msg:"DestroyMesh",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
}<br />
<br />
===Adding/changing light property for an object===<br />
<br />
{<br />
msg:"Light"<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479"<br />
diffuse_color:[.25,.5,1],<br />
specular_color: [.2,1,.5],<br />
power: 1.0,//exponent on the light<br />
ambient_color: [0,0,0],<br />
light_range: 1.0e5,<br />
constant_falloff: 0.5,<br />
linear_falloff: 0.2,<br />
quadratic_falloff: 0.1,<br />
cone_inner_radians: 0,<br />
cone_outer_radians: 0,<br />
cone_falloff: 0.5,<br />
type: "POINT",//options include "SPOTLIGHT" or "DIRECTIONAL"<br />
casts_shadow: true<br />
}<br />
<br />
===Removing light property for an object===<br />
{<br />
msg:"DestroyLight"<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479"<br />
}<br />
<br />
==Camera Management==<br />
<br />
===Creating camera properties on an object===<br />
{<br />
msg:"Camera"<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479"<br />
}<br />
<br />
===Destroying and cleaning up a camera===<br />
{<br />
msg:"DestroyCamera"<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479"<br />
}<br />
<br />
===Attach a camera to an object's texture===<br />
{<br />
msg:"AttachCamera",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479", // Camera object's id<br />
texobjid:"9a10e9c1-31fb-43e8-9a20-6545d9a62fdb", // Id of object with a mesh<br />
texname:"example.png"//overwrites this texture on the texobjid object.<br />
}<br />
===Attach a camera to a render target===<br />
{<br />
msg:"AttachCamera",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479", // Camera object's id<br />
target:0//writes to this framebuffer--- 0 for left ,1 right for stereo, etc.<br />
}<br />
===Detach a camera from its render target===<br />
{<br />
msg:"DetachCamera",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479", // Camera object's id<br />
}<br />
<br />
==Skeleton Management==<br />
<br />
===Streaming some joint locations===<br />
{<br />
msg:"AnimateBone",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
animation:"uniqueAnimationIdentifier",//so this movement can be associated with one animation and blended with others<br />
//if not specified this is a hard constraint (i.e. foot is glued to a wall in order to avoid penetrating it)<br />
weight:1.0,//the weight for prospective blending, defaults to 1.0<br />
time:1250120951209510295,//milliseconds since 1970<br />
bone:["ankle","arm"]<br />
pos:[[1,2,3],[2,3,4]]<br />
vel:[[.25,0,0],[0,0,0]]<br />
orient:[[.5,0,0,.5],[1,0,0,0]]<br />
rotaxis:[[0,0,1],[0,1,0]]<br />
rotvel:[.25,0],<br />
interpolate:true//if false then the bone should snap to the location unless smooth is set (in which case it should interpolate as quickly as possible) defaults to true<br />
}<br />
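The weight field suggests blending several animations' contributions to the same bone. The spec doesn't fix the blend function, but a normalized weighted average is one plausible reading:<br />

```javascript
// Blend per-bone positions from several concurrent animations by
// normalized weight. The spec does not define the blend function, so
// this weighted average is only one plausible interpretation.
function blendBone(samples) {  // samples: [{pos:[x,y,z], weight:w}, ...]
  let total = 0;
  for (const s of samples) total += s.weight;
  const out = [0, 0, 0];
  for (const s of samples) {
    for (let i = 0; i < 3; i++) out[i] += s.pos[i] * (s.weight / total);
  }
  return out;
}
```

Hard constraints (messages sent without an animation field) would bypass this blend entirely and pin the bone directly.<br />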
<br />
==Scene Queries==<br />
===Clicking===<br />
The standard javascript callbacks are always enabled... you will get the following messages from time to time on the receiving stream.<br />
<br />
{<br />
msg:"onclick",<br />
which:2,//right mouse button = 2, left mouse button = 0<br />
x:100,<br />
y:102<br />
}<br />
<br />
Every onclick will produce a matching onrelease, even if the mouse is outside the window.<br />
{<br />
msg:"onrelease",<br />
which:2,//right mouse button = 2, left mouse button = 0<br />
x:100,<br />
y:102<br />
}<br />
<br />
{<br />
msg:"drag",<br />
queryid:8//messages will come back with an id field set the same<br />
x:103,<br />
y:105<br />
}<br />
<br />
===MouseMove===<br />
{<br />
msg:"EnableMouseMove",<br />
queryid:8,//messages will come back with an id field set the same<br />
drag:false//defaults to false...if true, then deltas are sent until up is received?<br />
}<br />
{<br />
msg:"DisableMouseMove",<br />
queryid:8,//messages will come back with an id field set the same<br />
drag:false//defaults to false...if true, then deltas are sent until up is received?<br />
}<br />
<br />
<br />
{<br />
msg:"mousemove",<br />
x:100,<br />
y:102<br />
}<br />
<br />
{<br />
msg:"EnableMouseDownUp",<br />
queryid:8,//messages will come back with an id field set the same<br />
drag:false//defaults to false...if true, then deltas are sent until up is received?<br />
}<br />
<br />
{<br />
msg:"mousedown",<br />
which:2,//right mouse button = 2, left mouse button = 0<br />
x:100,<br />
y:102<br />
}<br />
{<br />
msg:"mouseup",<br />
x:100,<br />
y:102<br />
}<br />
<br />
<br />
===Keyboard===<br />
<br />
{<br />
msg: "keypress"<br />
altKey:true,<br />
metaKey:false,<br />
ctrlKey:true,<br />
repeat:false<br />
shiftKey:true<br />
which:25//see keyCode<br />
}<br />
{<br />
msg: "keydown"<br />
altKey:true,<br />
metaKey:false,<br />
ctrlKey:true,<br />
repeat:false<br />
shiftKey:true<br />
which:25//see keyCode<br />
}<br />
{<br />
msg: "keyup"<br />
altKey:true,<br />
metaKey:false,<br />
ctrlKey:true,<br />
repeat:false<br />
shiftKey:true<br />
which:25//see keyCode<br />
}<br />
<br />
===Picking===<br />
{<br />
msg:"EnablePicking",<br />
queryid:8,//messages will come back with an id field set the same<br />
drag:false//defaults to false...if true, then deltas are sent until up is received?<br />
}<br />
{<br />
msg:"DisablePicking",<br />
queryid:8,//messages will come back with an id field set the same<br />
drag:false//defaults to false...if true, then deltas are sent until up is received?<br />
}<br />
<br />
<br />
Responses look like<br />
{<br />
msg:"Pick",<br />
queryid:8//messages will come back with an id field set the same<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
pos:[1,2,3],//where on the surface did it hit?<br />
normal:[.5,0,.86],//what the direction of the normal is at that point<br />
x: 100, //relative to canvas<br />
y: 200<br />
}<br />
{<br />
msg:"PickDrag",<br />
queryid:8//messages will come back with an id field set the same<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
pos:[1,2,3],//where on the surface did the drag FIRST hit?<br />
normal:[.5,0,.86],//what the direction of the normal is at that point<br />
planedelta:[.24,.52,.25]//how far did it move in the camera plane<br />
perpdelta:[0,1,0]//how far did it move perpendicular to camera plane<br />
x: 100, // original pixel coordinates of mousedown, relative to canvas<br />
y: 200,<br />
xdelta: 15, // delta from original pixel coordinates.<br />
ydelta: 3<br />
}<br />
{<br />
msg:"PickRelease",<br />
queryid:8//messages will come back with an id field set the same<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
pos:[1,2,3],//where on the surface did the drag FIRST hit?<br />
normal:[.5,0,.86],//what the direction of the normal is at that point<br />
planedelta:[.24,.52,.25]//how far did it move in the camera plane<br />
perpdelta:[0,1,0]//how far did it move perpendicular to camera plane<br />
x: 100, // original pixel coordinates of mousedown, relative to canvas<br />
y: 200,<br />
xdelta: 15, // delta from original pixel coordinates.<br />
ydelta: 3<br />
}<br />
<br />
===Pick/Hovering===<br />
{<br />
msg:"EnablePickHover",<br />
queryid:9,//messages will come back with an id field set the same<br />
approximate:true//defaults to true...if true, then only rough bounds are used<br />
}<br />
{<br />
msg:"DisablePickHover",<br />
queryid:9,//messages will come back with an id field set the same<br />
approximate:true//defaults to true...if true, then only rough bounds are used<br />
}<br />
Responses look like<br />
{<br />
msg:"HoverFocus",<br />
queryid:9//messages will come back with an id field set the same<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
pos:[1,2,3],//approx where on the surface did it hit?<br />
x:100,//screen coords<br />
y:102<br />
}<br />
{<br />
msg:"HoverBlur",<br />
queryid:9//messages will come back with an id field set the same<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
x:100,//screen coords<br />
y:102<br />
}<br />
<br />
<br />
<br />
===Requesting intersection===<br />
{<br />
msg:"RayTrace",<br />
id:5<br />
pos:[2,3,4],<br />
dir:[.24,.33,.5],<br />
multiple:true//if false, only return first hit<br />
infinite:false//if false use length of dir to specify ray length<br />
}<br />
<br />
===Intersection callback===<br />
{<br />
msg:"Intersections",<br />
id:5,<br />
pos:[[2,3,4],[2.23,3.32,4.49]],<br />
normals:[[0,1,0],[.5,0,.86]],<br />
ids:["f47ac10b-58cc-4372-a567-0e02b2c3d479","a33ff133-58dd-2272-dd6a-12aadc31d173"]<br />
}<br />
<br />
<br />
=Experimental/Brainstorming ideas for the API=<br />
I decided to reserve a section of the wiki for sort of bleeding edge ideas of cool features that would be nice to have. I could have put that in the "talk" page, but I think it makes more sense here so that it will get wider exposure. These are meant to be things that would help in drawing real scenes and building real VW systems but that we haven't figured out a good API to yet.<br />
<br />
==Attaching UI elements to graphics objects==<br />
The UI will naturally need to be in HTML since that's the best established cross platform, sandboxed UI system.<br />
<br />
The user may specify a 3d location, orientation and scale at which a UI dialog should appear. The graphics system should do its best to scale and position the UI in the appropriate place, but on many systems the UI may be restricted to always face the camera and stay horizontal relative to the bottom of the screen. The UI should not be displayed if it is completely invisible from the camera angle or smaller than 10 pixels.<br />
===Creating/Updating UI Element===<br />
{<br />
msg:"IFrame"<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
uri: "http://example.com"<br />
}<br />
<br />
===Destroying UI Element===<br />
{<br />
msg:"DestroyIFrame"<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
}<br />
<br />
<br />
It seems like there should be a manner aside from "embedded iframes" to get art defined in the DOM into the scene graph--perhaps the canvas tag is the way to go here? But maybe that's too WebGL-specific and won't work for an Ogre port of this.<br />
<br />
==Attaching 3d Text to an Object==<br />
I'm just brainstorming here: it seems like WebGL has facilities to do this efficiently, but I don't have a good use case except building a rendering system inside a canvas tag or something.<br />
<br />
Perhaps the canvas tag is the way to go <br />
<br />
{<br />
msg:"Text",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
text:"This is a test of the emergency broadcast system",<br />
font:"size=+1"<br />
}<br />
<br />
{<br />
msg:"DestroyText",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
}<br />
<br />
==Particle System==<br />
===Adding a particle system to an object===<br />
This mimics the Ogre interface, and we introduce a number of billboard types:<br />
;point<br />
:The default arrangement; this approximates spherical particles, and the billboards always fully face the camera.<br />
;oriented_common<br />
:Particles are oriented around a common, typically fixed direction vector (see common_direction), which acts as their local Y axis. The billboard rotates only around this axis, giving the particle some sense of direction. Good for rainstorms, starfields, etc. where the particles will be traveling in one direction - this is slightly faster than oriented_self (see below).<br />
;oriented_self<br />
:Particles are oriented around their own direction vector, which acts as their local Y axis. As the particle changes direction, the billboard reorients itself to face that way. Good for laser fire, fireworks and other 'streaky' particles that should look like they are traveling in their own direction.<br />
;perpendicular_common<br />
:Particles are perpendicular to a common, typically fixed direction vector (see common_direction), which acts as their local Z axis, with their local Y axis coplanar with the common direction and the common up vector (see common_up_vector). The billboard never rotates to face the camera; you might use a double-sided material to ensure particles are never culled by back-face culling. Good for aureolas, rings, etc. where the particles will be perpendicular to the ground - this is slightly faster than perpendicular_self (see below).<br />
;perpendicular_self<br />
:Particles are perpendicular to their own direction vector, which acts as their local Z axis, with their local Y axis coplanar with their own direction vector and the common up vector (see common_up_vector). The billboard never rotates to face the camera; you might use a double-sided material to ensure particles are never culled by back-face culling.<br />
For further documentation about the properties, see<br />
http://www.ogre3d.org/docs/manual/manual_35.html#SEC191<br />
http://www.ogre3d.org/docs/manual/manual_36.html#SEC208<br />
<br />
<br />
{<br />
msg:"ParticleSystem",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
mesh:"http://example.com/billboard.dae",//the mesh should be rescaled to be a 1x1 mesh with<br />
particle_size:[20,20],<br />
cull_each:false,<br />
quota:10000,<br />
billboard:"oriented_self",<br />
sorted:false,//defaults to false--whether the particles should be sorted<br />
local:false,//defaults to false--if true, rotating the node after a particle is emitted will rotate the particle<br />
direction:[0,0,1],//the common direction for oriented_common or perpendicular_common<br />
up:[0,0,1],//only required if billboard is set to perpendicular_self or perpendicular_common; the common up vector used to orient all particles in the system<br />
accurate_facing:false,//whether facing is taken from the camera's facing or computed per billboard<br />
iteration_interval:0.125,//how often the particles are updated--if set to 0, particles are updated every frame<br />
invisibility_timeout:10//how many seconds outside the view frustum before the system stops updating<br />
}<br />
<br />
{<br />
msg:"DestroyParticleSystem"<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479"<br />
}<br />
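The two messages above can also be assembled programmatically. The Python sketch below builds a ParticleSystem message with the defaults from the example and a matching DestroyParticleSystem message. The helper names (make_particle_system_msg, make_destroy_msg) are illustrative, not part of any existing Sirikata API, and JSON is assumed as the wire encoding.<br />

```python
import json
import uuid

# Illustrative helpers (not part of any existing Sirikata API) that build
# the "ParticleSystem" / "DestroyParticleSystem" messages described above.
# JSON is assumed as the wire encoding.
def make_particle_system_msg(mesh_url, system_id=None, **options):
    msg = {
        "msg": "ParticleSystem",
        "id": system_id or str(uuid.uuid4()),
        "mesh": mesh_url,
        "particle_size": [20, 20],
        "cull_each": False,
        "quota": 10000,
        "billboard": "oriented_self",
        "sorted": False,
        "local": False,
        "accurate_facing": False,
        "iteration_interval": 0.125,
        "invisibility_timeout": 10,
    }
    msg.update(options)  # caller-supplied fields override the defaults
    return msg

def make_destroy_msg(system_id):
    # Tearing a system down only needs the message type and the id.
    return {"msg": "DestroyParticleSystem", "id": system_id}

create = make_particle_system_msg("http://example.com/billboard.dae",
                                  billboard="perpendicular_common",
                                  direction=[0, 0, 1], up=[0, 0, 1])
wire = json.dumps(create)  # what would actually go over the wire
destroy = make_destroy_msg(create["id"])
```

Note that the destroy message reuses the same id, so the receiver can match it to the system it tears down.<br />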
<br />
<br />
Once a system is created, particles need to be emitted from it. There should be a global map of default emitters named ParticleEmitters consisting of at least<br />
"Point", "Box", "Cylinder", "Ellipsoid", "Shell", and "Ring"; the extra attributes are specified at http://www.ogre3d.org/docs/manual/manual_38.html<br />
{<br />
msg:"ParticleEmitter",//add or update a particle emitter<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
name:"Flare",<br />
type:"Ring",<br />
angle:15,<br />
emission_rate:75,<br />
time_to_live:[2.5,3],//range between 2.5 and 3<br />
direction:[0,1,0],//3d vector<br />
speed:[250,300],//range between 250 and 300<br />
colour_range:[[1,0,0],[0,0,1]],//random colour between the endpoints<br />
position:[0,0,0],<br />
repeat_delay:[2.5,5]<br />
}<br />
<br />
{<br />
msg:"RemoveParticleEmitter",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
name:"Flare"<br />
}<br />
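The ranged parameters above can be interpreted as follows (this is a sketch of assumed semantics, not documented behavior): a two-element range such as time_to_live:[2.5,3] yields a uniform random value in that range, and colour_range interpolates between its two endpoint colours at a random parameter.<br />

```python
import random

# Sketch of assumed semantics for the ranged emitter parameters above:
# a [lo, hi] range samples uniformly, and colour_range interpolates
# between its two endpoint colours at a random t in [0, 1).
def sample_range(lo_hi, rng=random.random):
    lo, hi = lo_hi
    return lo + (hi - lo) * rng()

def sample_colour(colour_range, rng=random.random):
    (r0, g0, b0), (r1, g1, b1) = colour_range
    t = rng()
    return [r0 + (r1 - r0) * t, g0 + (g1 - g0) * t, b0 + (b1 - b0) * t]

emitter = {
    "msg": "ParticleEmitter",
    "id": "f47ac10b-58cc-4372-a567-0e02b2c3d479",
    "name": "Flare",
    "type": "Ring",
    "time_to_live": [2.5, 3],
    "speed": [250, 300],
    "colour_range": [[1, 0, 0], [0, 0, 1]],
}

ttl = sample_range(emitter["time_to_live"])      # seconds this particle lives
speed = sample_range(emitter["speed"])           # initial speed
colour = sample_colour(emitter["colour_range"])  # initial colour
```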
<br />
There may be forces applied to the emitters,<br />
and there must be a global map of affectors called ParticleAffector from which the relevant affector is selected, consisting of at least<br />
<br />
LinearForce, ColourFader, Scaler, Rotator, ColourInterpolator, ColourImage, DeflectorPlane, DirectionRandomiser<br />
The detailed definitions are contained at http://www.ogre3d.org/docs/manual/manual_40.html#SEC234<br />
<br />
{<br />
msg:"ParticleAffector",//add or update a particle affector<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
name:"TheForce",<br />
type:"LinearForce",<br />
force_vector:[0,-100,0],<br />
force_application:"add"<br />
}<br />
<br />
{<br />
msg:"RemoveParticleAffector",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
name:"TheForce"<br />
}<br />
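A minimal sketch of what a LinearForce affector would do to a particle's velocity each update, assuming semantics loosely modeled on Ogre's LinearForce affector: "add" accumulates force over time into the velocity, while "average" blends the velocity toward the force vector. These semantics are an assumption for illustration, not a spec.<br />

```python
# Assumed LinearForce semantics, loosely modeled on Ogre's affector:
# "add" accumulates force over time; "average" blends toward the force.
def apply_linear_force(velocity, force_vector, mode, dt):
    if mode == "add":
        # v' = v + F * dt
        return [v + f * dt for v, f in zip(velocity, force_vector)]
    if mode == "average":
        # v' = (v + F) / 2
        return [(v + f) * 0.5 for v, f in zip(velocity, force_vector)]
    raise ValueError("unknown force_application: %r" % mode)

affector = {
    "msg": "ParticleAffector",
    "id": "f47ac10b-58cc-4372-a567-0e02b2c3d479",
    "name": "TheForce",
    "type": "LinearForce",
    "force_vector": [0, -100, 0],
    "force_application": "add",
}

# One 0.1 s step of the example gravity-like force on a resting particle.
v = apply_linear_force([0, 0, 0], affector["force_vector"],
                       affector["force_application"], 0.1)
```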
<br />
==Specifying a terrain for the world==<br />
Terrain would need to be chunked and stored in some widely readable format. It would be nice to be able to tap into Google Earth for terrain data, but ideas for how to do this are still at a very early stage.<br />
<br />
=Deprecated API Ideas=<br />
Here we record ideas we considered but decided to discard, so that they don't resurface as new ideas; they may be discussed here and evaluated for re-addition if someone feels strongly that they should be included.<br />
<br />
<br />
==Skeleton file formats==<br />
<br />
The reason these were removed is that they are too brittle (it's hard to blend a wave and a walk animation and have the steps not come out half as wide) and it's difficult to keep the skeletons out of trouble (e.g. feet through the ground),<br />
so we think that the physics system in general should send the bone positions and timestamps, since it is the arbiter of what intersects what--and it can always read the skeleton file format.<br />
<br />
===Animating a skeleton based on a time based animation===<br />
{<br />
msg:"Ani",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
time:489192048120984102,///milliseconds since 1970 that the animation should be started from (skip frames if now is later)<br />
animation:"http://example.com/animation.dae",<br />
loop:false,<br />
weight:1.0,///how strongly this animation should weigh against other animations that use the same bones<br />
fadein:2.3//how many seconds to fade in<br />
}<br />
<br />
Note that animation.dae should have annotations for the loop-in and loop-out points within the .dae so that loop can function intelligently.<br />
===Stopping a skeleton based on a time based animation===<br />
{<br />
msg:"AniStop",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
animation:"http://example.com/animation.dae",<br />
fadeout:1.0//how many seconds to fade out<br />
}<br />
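The "skip frames if now is later" rule in the Ani message can be sketched as follows: given the start time in milliseconds since 1970 and the clip duration, compute how far into the animation playback should currently be. The helper name and the loop handling are assumptions for illustration.<br />

```python
import time

# Sketch of the "skip frames if now is later" rule from the Ani message:
# map a wall-clock start time (ms since 1970) to an offset into the clip.
def animation_offset(start_ms, duration_s, loop, now_ms=None):
    if now_ms is None:
        now_ms = int(time.time() * 1000)  # default to the current time
    elapsed_s = max(0.0, (now_ms - start_ms) / 1000.0)
    if loop:
        return elapsed_s % duration_s  # wrap around at the loop point
    return min(elapsed_s, duration_s)  # clamp at the final frame
```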
<br />
<br />
==Should an object just be a sprite==<br />
We figured that a COLLADA quad file may be a more compact representation for a sprite and can contain the appropriate shaders, materials, etc.<br />
===Making an object a point sprite===<br />
This mimics the Ogre interface; we introduce a number of billboard types:<br />
point<br />
The default arrangement, this approximates spherical particles and the billboards always fully face the camera. <br />
oriented_self<br />
Particles are oriented around their own direction vector, which acts as their local Y axis. As the particle changes direction, so the billboard reorients itself to face this way. Good for laser fire, fireworks and other 'streaky' particles that should look like they are traveling in their own direction. <br />
perpendicular_self<br />
Particles are perpendicular to their own direction vector, which acts as their local Z axis, with their local Y axis coplanar with their own direction vector and the common up vector (see common_up_vector). The billboard never rotates to face the camera, so you may want to use a double-sided material to ensure particles are never culled by back-face culling.<br />
For further documentation about the properties see<br />
http://www.ogre3d.org/docs/manual/manual_35.html#SEC191<br />
http://www.ogre3d.org/docs/manual/manual_36.html#SEC208<br />
<br />
{<br />
msg:"Sprite",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479",<br />
sprite:"http://example.com/test.jpg",//the mesh should be rescaled to fit inside a unit sphere centered at 0,0,0<br />
billboard:"perpendicular_self",<br />
sorted:false,//defaults to false--whether the particles should be sorted<br />
up:[0,0,1]///only required if billboard is set to perpendicular_self; the common up vector used to orient all particles in the system<br />
}<br />
<br />
===Removing point sprite property from object===<br />
{<br />
msg:"DestroySprite",<br />
id:"f47ac10b-58cc-4372-a567-0e02b2c3d479"<br />
}</div>Rryk