Sunday night I spent ages trying to find a decent Babbage-proof dress to wear at the November Aether Salon, where Miss Ordinal Malaprop would be lecturing on the topic of ‘Weapons!’ After an hour in inventory I came up with something that was acceptable - not terribly suited, but acceptable - and teleported to the Aether Salon at Babbage Palisade. Upon arriving, I was quickly assigned a lovely Victorian seat to wear, making it a bit easier to blend in with the massive steampunk crowd that had gathered to listen to Miss Malaprop’s wise words.
Miss Malaprop’s talk was - as anticipated - enlightening, and she offered an easy categorisation for weapon owners. Personally, I think I fall under this category: “There are people whose _only_ real interest is in form. “Fashionistas,” if that is not an insulting term. A weapon is basically an accessory. It really doesn’t matter an awful lot quite what it does.” Sure, I’m totally happy with my Luger shooting blueberry muffins. :D
Another excerpt: “While preparing myself for this evening I was pondering on the nature of what a “weapon” actually is and really, the conclusion that I came to was that, at base, it is in fact a communications device. And not merely in the usual “only language they understand” sort of way. The underlying nature of a “weapon” is to communicate certain information - be that by a built-in protocol such as the Linden Damage System or via some sort of other more custom one.” I am only left to wonder what my Blueberry Shooting Luger & Couch gun communicate! ;) If you want to read more about Miss Malaprop’s thoughts on Second Life Weaponry, or benefit from the good advice she shares, I suggest you head over to the transcript on Babbage’s Ning Group. General information on Miss Malaprop and her projects can be found at ordinalmalaprop.com.
Babbage & Cake
Right before I crashed, Dandellion - attending in classy steampunk robot gear - pointed out to me that not everybody seemed to be focused on the nature and creation of weaponry. Some seemed more interested in the nature and consumption of the tea & cake. Aren’t they adorable? (in a non-pervert kinda way, thank you for needing to clarify that, RL media and in-world witch hunters! *sighs*)
You are a girl and want a gun?
Because I am in a giving mood - I’m feeling strange lately - and still a Virtual Girl with Gun, I’m buying the first lady avatar that replies here with the correct answer to ‘What was Ordinal Malaprop’s solution as to not have to use voice in SL?’ one of Miss Malaprop’s finest works, a vertical clock-loading pistol. You’re a bloke? Sorry, this is one of those cases of positive discrimination that sucks bigtime! ;)
Update: The below machinima was, sadly enough, not shot in Second Life, but is a demo of the nVidia PhysX fluid physics. I refer you to this comparison of the nVidia PhysX fluid physics demo being rendered first on the CPU - Central Processing Unit - and then the GPU - Graphics Processing Unit. Replacing Lindsey Warwick’s video with the test comparison one, and hoping she just ‘mistagged’ instead of intentionally fooling me! ;)
Imagine for your Second Life: pools of blood and/or champagne, throwing water balloons, wet t-shirt contests, fountains of eternal avatar youth, … . All of those might already be possible, if you just know how to activate them* - who knows what the SL physics engine is already capable of? Take a look at this machinima by Lindsey Warwick (human: Sara Carter); shiny bump maps, shading & liquid, bloody water, it has it all:
Don’t ask me ‘how’, I’m just as ‘muchos curiositos’ as you! ;) Things were obviously too graphically juicy to be true. :d If you already know, care to tell? But if the big flood would someday hit the grid, this end of Second Life’s drought would be a great step for virtual mankind! Still, shall we start building a solid prim ark, big enough to hold 60,000 concurrency? ;)
* For the wet t-shirt contest t-shirts, just alpha the entire t-shirt at least 25% (so soft grey on your alpha channel in Photoshop), or just the places you want to appear ‘wet’, et voila, the needed bits will shine through.
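The alpha trick above can be sketched in plain Python, treating a texture as a grid of RGBA pixels. This is just a toy illustration of the math (in practice you would scale the alpha channel in Photoshop or GIMP); the function and variable names are my own invention.

```python
# Sketch: make a t-shirt texture ~25% transparent so the bits underneath
# ‘shine through’. A texture is modelled here as a flat list of RGBA tuples.

def wet_look(pixels, transparency=0.25):
    """Return pixels with the alpha channel reduced by `transparency` (0.0-1.0)."""
    out = []
    for (r, g, b, a) in pixels:
        out.append((r, g, b, int(a * (1.0 - transparency))))
    return out

shirt = [(200, 200, 200, 255)] * 16   # a tiny, fully opaque 4x4 grey texture
wet = wet_look(shirt)                  # every pixel now has alpha 191 (75% opaque)
```

The same idea applies per-region: only scale the alpha of the pixels you want to look ‘wet’, and leave the rest fully opaque.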
Imagine what flexible sculpties would do for the SL fashion industry (moving hoodies, bouncing leg warmers?), normal building (sculptie flags?) or even the SLex industry (non-static penises, breasts bouncing up and down as you run, and bouncing sexgen beds?). Flexi sculpties: a dream of many - 357 votes, a lot by JIRA standards - and a real possibility, as demonstrated in the videos below. The ‘hack’ that enables flex sculpts is a viewer-side patch, as shown in JIRA issue VWR-9203 Flexible Sculpted Prims.
Want to test? No problem: Chalice Yao compiled a Second Life client that implements these flexible sculpted prim patches. It’s available for download here (as Miss Chalice asked for mirrors). You do not need to install this client, just unpack it to a folder and run it.
Do mind that this is a large hack, so don’t expect prime performance, and you do not want to leave sculpties like this all around the grid, so please, clean up after testing! Zwagoth Klaar, one of the ‘parents’ of this patch, shares some good advice: ‘Appears that scale has to be applied using the method for normal prims, not the one for sculpts. This seems to fix things. It also brings up another issue with these. You lose half your resolution because things that are in the negative region of the flexible do not behave correctly, this caused me a large amount of confusion, it makes things warp and flip inside out, also causes distortions. So, for the moment, people wanting to create sculpts for this, should keep their exports above the 0,0,0 center of the sculpt frame, or it will not behave correctly. Objects flex along the Z axis, keep that in mind.‘
Once more, thanking Mr Malburns for pointing this out to me.
As per overall Garden law, I can’t show you photographs yet, but I don’t think this is a violation of that rule. This is what happens when Lauren and I need to hunt you down for artist information. *grins* Poor Mr. Cheen Pitney.
Anyway, believe me, things look great and very NPIRL - not possible in real life. Even the pathways, trees, music stage, avatars, … and the press release can be found here. For more news, keep an eye on garden.rezzable.com - you’ll also find an overview of the creators and builders participating in the creation of the Garden of NPIRL delights there - and the NPIRL blog.
I’m curious what your opinions about the garden will be!
Torley posted about the numbers that define the ARC - Avatar Rendering Cost - as declared by Runitai Linden:
A base avatar begins with a score of 1
5 points added for each unique texture on the avatar (not counting the base skin). Rationale: Unique textures break batches, create CPU overhead for decoding, and consume GPU memory bandwidth. However, note that this is across the avatar- so two unique textures across 10 prims only count as two unique textures!
All attachments are then looked at on a per-prim level. The prims are weighted as follows:
10 base points for having the prim.
1 point added if prim is invisible, shiny, or glowing (each counts). Rationale: Invisiprims/shiny/glow create a small amount of overhead by breaking batches or requiring an extra render pass.
1 point added for each planar-mapped face of the prim. Rationale: Planar mapping creates a small amount of CPU overhead that gets worse with flexible objects.
1 point added per meter, per axis, of the prim’s size. Rationale: Bigger prims are higher LOD and create more fill.
4 points added if prim has bump applied. Rationale: Bump mapping breaks batches and requires a register combiner, and creates a lot of CPU overhead when coupled with a flexible object.
4 points added for each transparent face of the prim. Rationale: Alpha creates a lot of overhead by needing to be sorted every frame AND by breaking batches.
4 points added for each animated textured face of the prim. Rationale: Animated textures break batches and require the use of a texture matrix.
8 points added if prim is flexi. Rationale: Flexible objects create a lot of CPU overhead and consume graphics bus bandwidth.
16 points added if prim is a particle emitter. Rationale: Particles create even MORE CPU overhead and consume graphics bus bandwidth.
Note that these weightings, and their resultant totals, are not a *perfect* measure of your cost- but more of a relative counter to weigh against other avatars. Point: it’s close, but it’s not scientifically perfect. For that, you’d have to delve deep into batch sizes and draw calls. These weightings and their description/rationales were written by Runitai Linden, one of our most senior graphics engineers and a man who knows rendering efficiency! :)
And another note- ironically, the Avatar Rendering Cost display itself is CPU intensive, since it’s essentially profiling all the avatars all the time. So a good practice would be to walk around and turn it on when you want a glimpse- otherwise if you turn it on at all times, know your performance is suffering!
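The weightings Runitai lists above are simple enough to sketch as a small calculator. This is my own toy rendition of the formula as described, not Linden Lab code; the prim property names are invented for illustration.

```python
# Sketch of the Avatar Rendering Cost formula as quoted above.
# Each prim is a dict of properties; all field names are my own invention.

def arc(unique_textures, prims):
    """Rough ARC score: base avatar + texture cost + per-prim weights."""
    score = 1                                   # a base avatar begins at 1
    score += 5 * unique_textures                # 5 per unique texture (skin excluded)
    for p in prims:
        score += 10                             # 10 base points per prim
        # 1 point each for invisible / shiny / glow
        score += p.get("invisible", 0) + p.get("shiny", 0) + p.get("glow", 0)
        score += p.get("planar_faces", 0)       # 1 per planar-mapped face
        score += sum(p.get("size", (0, 0, 0)))  # 1 per meter, per axis
        score += 4 if p.get("bump") else 0      # bump mapping
        score += 4 * p.get("alpha_faces", 0)    # transparent faces
        score += 4 * p.get("anim_faces", 0)     # animated-texture faces
        score += 8 if p.get("flexi") else 0     # flexible prim
        score += 16 if p.get("particles") else 0  # particle emitter
    return score

# A one-prim flexi attachment, 1m on each axis, with one unique texture:
print(arc(1, [{"size": (1, 1, 1), "flexi": True}]))  # 1 + 5 + 10 + 3 + 8 = 27
```

It makes the red numbers easy to understand: one particle-emitting flexi prim already costs more than two plain prims.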
Sculpties… or sculpted prims… they will always stay a no-go for me, I’m afraid. I understand how they work - a Sculpt Texture or Sculpt Map is a standard RGB texture where the R (red), G (green) and B (blue) channels are mapped onto X, Y, and Z space - but I can’t create a good looking sculptie myself; I fear I still think too much in 2D.
But the fact that I can’t build should not stop me from learning and experimenting! *grins*
Thanks to Miss Kryptonia Paperdoll, I’ve discovered Mr. Mango Splash’s in-world sculpted prim generator. It’s an incredible thing that allows you to create sculpt maps in Second Life, without using external 3D software such as Maya, Blender, … . You do need some basic Second Life building skills to select prims, move them, …. .
The Second Life Sculpted Prim Generator Mr Splash built works as follows:
The Sculptie Generator rezzes the basic shape for you: be it a cylinder or a sphere, and with a level of detail you can choose yourself. That basic shape consists of ‘nodes’, which are represented by little white spheres.
By moving those white spheres, you change the shape of the sculptie. You can edit multiple spheres at the same time by selecting them all and then using the resize tool. Handy if you want to work on something symmetric.
Just to be safe, tell the Sculptie Generator to save your work in a box.
Then you tell the Sculpted Prim Generator to rez a sculpt map - the texture you need for sculpties - based on the object you just created. It’s marvellous to see the Sculpt Map Generator do its work. It figures out the position of each white sphere, takes its coordinates and converts them to an RGB value. In a minute’s time, you have a sculpt texture rezzed before you in Second Life.
Allow the Sculptie Generator to lock your camera by sitting on it. Now you only need to snapshot the texture (made out of a whole lot of colour coded prims) and upload the snapshot.
Rez a box, set it to sculptie and drag the snapshot you just took where the sculpt texture should go. Et voila: a very ugly mushroom. (See the ‘Vint can’t build’ disclaimer.)
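The coordinates-to-RGB step the generator performs can be sketched in a few lines of Python. Positions are assumed to be normalised to the prim’s bounding box (0.0 to 1.0 on each axis), so each axis maps onto one 0-255 colour channel; the function names are my own, not Mr Splash’s.

```python
# The R,G,B <-> X,Y,Z mapping behind sculpt maps, in miniature.
# Each pixel of a sculpt texture encodes one vertex position.

def xyz_to_rgb(x, y, z):
    """Encode a normalised (0.0-1.0) vertex position as one sculpt-map pixel."""
    return (int(x * 255), int(y * 255), int(z * 255))

def rgb_to_xyz(r, g, b):
    """Decode a sculpt-map pixel back into a normalised position."""
    return (r / 255.0, g / 255.0, b / 255.0)

print(xyz_to_rgb(0.5, 0.0, 1.0))  # (127, 0, 255)
```

This also explains why the rezzed texture looks like a rainbow-coloured grid: every colour gradient is really a gradient in position.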
When Tateru over at Massively wrote about ‘Avatar Rendering Cost’, a new feature in RC 1.20’s Advanced (formerly: ‘Client’) menu, she asked a good question: ‘Will we see products one day labeled like this? “Purse, scripted, 8 prims, 90ARC“‘
One of the things that the new Second Life 1.20 release candidate viewer sports is a nifty new feature that busts some serious myths about avatar related lag. In short, by enabling the option, every avatar gets a number displayed over their heads showing how much work your PC needs to go through to render the avatar. This is the avatar rendering cost.
So far we have seen green numbers (low numbers, which are good), and red ones (high numbers, which are not so good). The amount of work that goes into rendering an avatar (now that we can easily measure it) isn’t quite affected by things the way we thought it was.
You activate the showing of ARC numbers the following way: Advanced > Rendering > Info Displays > Avatar Rendering Cost.
Curious about my own ‘rendering footprint’ - as opposed to my ecological one - I did some ARC testing, browsing through the different ‘avatar versions’ I use often. Impressively, different hairstyles could mean a change of +5000 ARC units. And my Greenie avatar came out pretty well. I should take a look at Quickly Alter - which is just one, huge sphere - too.
Next test: put 60 avatars with ARC 8000 on a sim and compare that with 60 avatars with ARC 400 on the same sim? I think that - although this is an important factor and fun data to toy around with - we must not fixate on avatar lag alone. How our avatar looks, for most of us, does depict ‘who we are’ in the virtual world. With big events the Avatar Rendering Cost will be far from the sole factor determining the amount of ‘lagginess’: optimal build quality, draw distances, scripts, sim class, state of the grid, … are not to be forgotten about. But yes, Miss Tateru, I would like to know the ARC of a hairstyle before I buy it! ;)
As with most Google Tech Talks, this is way above my virtual tea cup - which prefers to just travel and be amazed - but even while grasping less than 50% of it, this is interesting. For virtual worlds to have a future, we will definitely need well-built ‘cities’ - or urban areas - most probably even ones that resemble their First Life brothers and sisters very closely. The problem pointed out in this video, the ‘Content Challenge’, definitely holds true for Second Life:
Architectural content like cities, buildings and interiors is extremely important - but also very complex.
No tools available for efficient creation of detailed architectural 3D models.
The solution presented here is ‘shape grammar’, which leads to automatic content creation. They call this procedural modeling. Very, very, very much simplified, this means that once you have your ’shape grammar’ in place, you give the computer a map and it figures out which buildings to build, where to put roads, what the buildings should look like, … which gives very uniform results. Of course, did I mention this is very much simplified? ;)
You can then fine-tune the resulting city by making small or larger changes to its ’shape grammar’. For instance, you’ve built a Roman city, but used the wrong time period and inserted Doric columns instead of Ionic ones? Easily corrected by changing the program’s grammar. Again, I must add ‘very much simplified’ to this.
The procedural modeling workflow shortly summarized:
Architectural design idea
Analyse the design and its parameters.
Create and define:
Create or get elements/textures.
Define city layout /initial shape(s).
Encode design: rule set(s)
Add stochastic behaviour to rules.
Apply rule set(s) and export models.
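The ‘rule set plus stochastic behaviour’ steps above can be sketched as a toy grammar: non-terminal symbols get rewritten by (randomly chosen) rules until only concrete shapes remain. Real CityEngine grammars are vastly richer; every rule and symbol below is invented purely for illustration.

```python
# A toy ‘shape grammar’: rewrite symbols until only terminal shapes remain.
import random

RULES = {
    "CITY":  [["BLOCK", "ROAD", "BLOCK"]],
    "BLOCK": [["HOUSE", "HOUSE"], ["TOWER"]],  # two options = stochastic choice
}
TERMINALS = {"ROAD", "HOUSE", "TOWER"}

def derive(symbols, rng):
    """Recursively expand non-terminals by picking one rule at random."""
    out = []
    for s in symbols:
        if s in TERMINALS:
            out.append(s)
        else:
            out.extend(derive(rng.choice(RULES[s]), rng))
    return out

print(derive(["CITY"], random.Random(0)))  # e.g. a mix of houses, towers and a road
```

Swapping Doric for Ionic columns, in this picture, is just editing one rule: the whole city re-derives with the correction applied everywhere.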
All this has led to a commercial program called The CityEngine that does procedural modeling of CG architecture and is set for release in May 2008. The uses they see for this? Gaming, entertainment, urban planning, archaeology, … . I’m sure you can think of some uses of your own. Impressive, nèh?
Google Talk explanation of ‘Urban Reconstruction and Modeling for Building Virtual Worlds’: Creating digital content for virtual worlds remains a significant challenge, especially for urban environments, which are among the largest and most complex. As display capabilities improve and audience expectations grow, procedural modeling techniques are becoming an increasingly important supplement to traditional modelling software. In this talk, we present grammar-based, image-based and interactive methods for the efficient creation of urban environments. Thus massive architectural models of high visual quality and geometric detail can be produced at low cost. Selected examples demonstrate solutions to previously unsolved modeling problems, especially to consistent mass modeling with volumetric shapes of arbitrary orientation. Furthermore, we show massive urban models with unprecedented level of detail, with the virtual rebuilding of the archaeological site of Pompeii as a case in point.
It’s not that I want to encourage anybody to do so - although having two virtual worlds can do no harm? - but this is one of the things that are ‘way beyond my cup of tea but highly fascinating to watch’. I suggest you IM Mike Sutton’s avie Lightninboy Snook with any questions. =p
What does SL2MV - short for Second Life to Multiverse Content Pipeline - do?
Set of integrated programs to take static 3D Models from Second Life to Multiverse
Creates DAE Model that can be imported into most 3D Modeling tools (3D Max, Maya, Blender)
Creates OGRE 3D Mesh and Material Files for Multiverse
Creates Texture Files
Imports mesh, materials and textures into Multiverse World listed in Repository
And what do you need?
DOS Batch Files
GL Intercept by Damian Trebilco
Modified Open GL Extractor (OGLE) by Eyebeam OpenLab
‘Finally, I will be able to do fairly decent looking sculpted prims!‘, I thought, when reading Hamlet Au’s article on Plopp Second Life. After some experimenting with it, it seems it’s not going to help very much at allowing a sculptie noob like me to create beautiful (or practical) sculpties like, for instance, Juliet Ceres does. Apparently, doing that is either ‘not in this lifetime’ for me, or at least only after learning how to work Maya or 3D Max or whatever. *sobs*
Maybe a child can grasp Plopp Second Life’s interface, but I could not. The thingie that looks like a garbage bin is actually a spray can and does not delete stuff. The rest isn’t very clear either. What’s wrong with ‘File’, ‘Edit’, and ‘Options’ menus?! The only thing that does make sense is the ‘day - night’ clock. Which is actually fun if you spin it very fast. :d But then again, how do I kill those annoying sound effects?
The drawing tool isn’t that great, but as Plopp does accept importing .png files - including alpha layer - I tried to do things that way. If only there would have been a way to tell it that my precious Linden Dollar coin is a coin, and not a sphere, I would have certainly started an affair with Plopp Second Life - despite the fact that it cannot keep its mouth - sounds - shut.
The Plopp Second Life site states the following ‘to do’ list:
improvement of the conversion to Sculpted Prims
sculpture preview within PloppSL
see how your sculptures will look in SecondLife™
automatic generation of sculpties for each part in the original image
currently we only support closed models without holes
Someone *beep* me when they have finished doing that? The idea of ‘just painting’ something and then converting it to a sculptie is great, the execution, well… *points at the to do list*.
Until the Plopp people have checked all that as ‘done’, you might want to try one of the following ’sculpted prim tools’:
Wings3D: A free and open source polygon mesh subdivision modeller. Wings 3D is available for most platforms, including Windows, Linux and Mac OS X. Wings 3D is ideally suited for modeling and texturing low to medium density polygon meshes. It has a wide range of very effective tools optimised for these tasks hidden behind its ‘minimalistic’ interface.
Blender: Formerly a company’s in-house tool, Blender is the current king of the open source modeling programs. With all the features of the expensive programs, an active development community and even some existing SL-based tools made by Residents, this is going to be the default choice for many people. Downsides: Blender’s interface is not newbie-friendly. That combined with spotty documentation can make for a slow learning curve. But make a good start by visiting Amanda Levitski’s tutorial on how to export the sculpties from Blender.
SculptyPaint: SculptyPaint was first written for creating 3D sculpt models for Second Life. Currently it can also export to .dxf files that can be read in Blender, Google SketchUp and other 3D modelling software.
ROKURO: You draw a line in 2d by editing the various points and the program effectively spins that line around an axis to create the 3d object. Cylinders and polygonal prisms are both possible.