From: Florian Kainz
Subject: Re: [Openexr-devel] Interpreting Deep Images, Revised Document
Date: Tue, 29 Oct 2013 16:59:28 -0700
User-agent: Mozilla/5.0 (X11; Linux x86_64; rv:17.0) Gecko/20130626 Thunderbird/17.0.7
For the most part the document describes how deep files are used in practice, and it tries to avoid changing existing conventions. Since real-world deep files are commonly messy, the document cannot pretend they are all tidy.

The main use case I can see for tidy files is deep texture maps such as shadow buffers, but even tidy files are not efficient for that. If deep shadow buffers and similar applications are considered important enough, then there should be a separate convention for storing cumulative color and opacity in a tidy form, preferably one that does not require computing exponentials and logarithms during lookups. This would require no library changes.

Florian

On 10/29/2013 04:38 PM, Larry Gritz wrote:
> Can you elaborate on why the new document puts it on the head of the
> reader to deal with overlapping or unsorted samples? What are the
> circumstances in practice that lead to "MESSY" deep files?
>
> -- lg
>
> On Oct 28, 2013, at 9:39 AM, Florian Kainz <address@hidden> wrote:
>
>> Samples in a single pixel are explicitly allowed to be unsorted and
>> overlapping. The previous version assumed that samples always had to
>> be sorted and non-overlapping, but that is not how deep images are
>> used in practice.
>
> --
> Larry Gritz
> address@hidden
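[Editor's note: the cumulative-opacity convention discussed above can be illustrated with a minimal sketch. This is plain Python with hypothetical names, not OpenEXR API. With per-sample alpha, evaluating transmittance partway through a volumetric sample means splitting the sample, which converts alpha to optical density and back (a log and an exp, per Beer-Lambert attenuation); a table of cumulative transmittance can instead be looked up with a plain interpolation.]

```python
import math

def transmittance_per_sample(samples, z):
    # samples: list of (z_front, z_back, alpha) volumetric samples,
    # assumed here to be tidy (sorted, non-overlapping).
    t = 1.0
    for zf, zb, a in samples:
        if z >= zb:
            # sample lies entirely in front of z: full attenuation
            t *= (1.0 - a)
        elif z > zf:
            # z falls inside the sample: split it, which requires
            # log/exp to attenuate proportionally to depth covered
            frac = (z - zf) / (zb - zf)
            t *= math.exp(frac * math.log(1.0 - a))
        # samples entirely behind z contribute nothing
    return t

def transmittance_cumulative(table, z):
    # table: list of (z, T) pairs, T = transmittance from the camera
    # down to depth z, sorted by z.  Lookup is a plain (here linear)
    # interpolation -- no log/exp needed.
    for (z0, t0), (z1, t1) in zip(table, table[1:]):
        if z0 <= z <= z1:
            w = (z - z0) / (z1 - z0)
            return t0 + w * (t1 - t0)
    return table[-1][1] if z > table[-1][0] else table[0][1]
```

The two representations store the same shadow function; the difference is only where the expensive math happens — at lookup time (per-sample) or once at file-creation time (cumulative).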