2021-ongoing;
Machine learning algorithms, custom software, original dataset, multichannel video installation;
Machine learning technical assistance: Artem Konevskikh
3D development: Enrico Zago
'Content Aware Studies II' refers back to the earlier series 'Content Aware Studies' (video documentation link), which, through practice-based inquiry, examines the investigative capacities of AI not only in cultural production but also in synthetic forms of knowledge production and in the automation of research and historiography, using classical sculpture as a case study.
Preoccupied with the issue of bias in AI-driven research practices today, I would like to propose the next phase of CAS, in which I would challenge previously established AI-based methodologies against data from prehistoric and geologic time archives, including the first stone tools, writing systems, and paleontological archives of fossilized plants, organisms, and other biogenic data. The datasets will be composed primarily of findings archived and documented in the repositories of contemporary natural history museum collections. Based on these datasets, I will employ AI techniques (including GANs) to generate blueprints of new instances to be produced in organic materials, using bio-ceramic 3D printing and the artificial fossilization techniques developed by researchers at the School of Earth Sciences, University of Bristol. Much as in the previous phase of CAS, where objects of neural antiquity were produced in marble, this time I would also like to focus on the materiality of these future exercises in synthetic ontology. As a result, a series of sculptural and screen-based installation works will be produced. How different would an AI-composed or, ontologically speaking, synthetic plant fossil seem compared with an actual sample from prehistoric floras? And would AI-manufactured proposals of newly rendered specimens be distinguishable from the remaining millions of actually existing species that were never scientifically cataloged?
And, finally, what would it mean to actually produce such objects using artificial fossilization techniques, in terms of philosophical concerns around the ontology, agency, and materiality of organic and inorganic subjects? What would bone remnants of prehistoric species look like if they were algorithmically derived and 3D-printed in calcium phosphate? Such engineered bone tissues or artificially matured stone imprints may come across as indistinguishable from genuine paleontological findings. Finally, as a form of presentation, I would like to offer an exposition of objects resembling a natural history museum hall, or, more precisely, the hall of a museum of synthetic history. I would like to raise these questions as forms of thought-object experiments, while examining the aesthetic, empirical, symbolic, and functional qualities of the derived objects exhibited alongside genuine objects of natural history. What other domains of natural science will emerge when the history and ontology of floras, faunas, single-celled organisms, yeasts, moulds, rocks, minerals, and those of unearthly origin are studied by algorithmic forms of knowing? In other words, I would like to design agents of artificial, automated reasoning committed to the conceptualization of their own emergence and the production of their own history and artifacts.
Issues at hand & the relationship between technology and culture.
The aforementioned AI techniques are quickly becoming ubiquitously instrumentalized in the investigation of historical documents, including the Voynich Manuscript, collaborations between the British Library and the Turing Institute, and others, as reported by Nvidia, and are also used as predictive instruments for modelling and designing futures. However, before celebrating such advancements, we might also want to critically examine the role of such forms of knowledge production: how does one distinguish between accelerated forms of empirical investigation and algorithmic bias? Will the question still hold when this is the new normal of historiography? Preoccupied with these warnings and questions on biases, authenticities, immaterialities, automations and historicities, CAS attempts to examine what visual and aesthetic qualities such guises convey when rendered by a synthetic agency and perceived through our anthropocentric lens. How much of our historical knowledge and interpretation, encoded into the datasets, will survive this digital digestion? The project examines new forms of historical knowledge and artistic production and calls into question the ethical implications of such approaches in relation to culture and the notion of an endangered anthropocentric world.
Biodiversity.
Recent scientific insights highlight the pressing issue of biodiversity, as estimates suggest that around 10 million species exist on Earth, with only 10–20% currently identified and described. The accelerating biodiversity extinction crisis has led to a reliance on “Big Data” production focused on species observation-based occurrences rather than specimen-based documentation, which is crucial for effective mapping and protection of biodiversity. Despite 300 years of exploration and cataloging, only a fraction of species has been well documented, resulting in significant gaps in high-quality biodiversity data and the proliferation of misidentified records. This lack of comprehensive knowledge generates a questionable foundation for future generations, as the history of life on Earth risks becoming confined to a limited dataset, perpetuating a feedback loop reliant on existing information and trending computational methods, such as data-driven and AI-powered research techniques.
Artificial Fossilization
Researchers at the University of Bristol, under the supervision of Jakob Vinther and Evan Saitta, have been conducting research into artificial fossilization. Their methodology is intended to aid the process of finding fossils, continuing the effort to complete our archives of biodiversity and our understanding of paleontological history by reverse-engineering fossils. Their published experimental protocol may well change the way fossilization is studied, as they have found ways to manipulate time, not the least of the forces behind the creation of a fossil. Through specially developed techniques for producing artificial fossils, the research group managed to compress millions of years of natural processes into a single day in the lab.
These artificial fossils are synthetic in origin, yet visually indistinguishable from genuine ones and, according to the material analysis in their paper published in 2018 (Saitta, Kaye and Vinther, 2018), structurally very similar. 'Artificial maturation' is an approach in which high heat and pressure accelerate the chemical degradation reactions that normally occur over millennia when a fossil is buried deep and exposed to geothermal heat and pressure from overlying sediment. Maturation has long been a staple of organic geochemists who study the formation of fossil fuels, and is similar to the more intense experimental conditions used to produce synthetic diamonds.
“The approach we use to simulate fossilisation saves us from having to run a seventy-million-year-long experiment,” reported Saitta. “We were absolutely thrilled. We kept arguing over who would get to split open the tablets to reveal the specimens. They looked like real fossils; there were dark films of skin and scales, the bones became browned. Even by eye, they looked right.” (Starr, 2018; Field Museum, 2018)
In their own words, they nickname the procedure 'Easy-Bake fossils', gamifying objects of history as their purpose becomes another one entirely, namely that of a tool. They describe the possibilities of their approach as those of reverse engineering: “Our experimental method is like a cheat sheet. If we use this to find out what kinds of biomolecules can withstand the pressure and heat of fossilization, then we know what to look for in real fossils.” (Field Museum, 2018). In this case, archaeological practice becomes a matter of knowing what to look for, as opposed to trying to find the undiscovered. From Saitta's statement, we understand that in order to address problems like the biodiversity crisis and similar issues in the palaeontological field, they intend to work backwards: to take an organism or marker which currently exists, create an artificial fossil of it, and review what remains after millennia of ageing processes. The remaining markers then become guidelines for what to search for and, if found, become a new strand of the historical narrative of this planet. Peculiarly, we are now faced with a type of 'reversed archaeology', where history is predetermined in a lab and fieldwork becomes a matter of finding the piece which fits the artificially created template: a painting-by-numbers type of palaeontological puzzle, which leads to yet another kind of recycling of knowledge as opposed to random discovery through seeking. This is similar to the problems which occur when attempting to expand biodiversity data using AI techniques on an existing dataset. The consequence of machine-learning knowledge production is that AI approaches questions with the intention of solving them, no matter how much force it must apply to mould the existing data into a solution to the set task. How big is the gap between a DeepDream plate of spaghetti and meatballs morphing into a hellscape of dogs, as AI constructs the hallucination with brute force, and archaeologists creating 'Easy-Bake' versions of fossils and scouring the earth for their counterparts, potentially blind to the unknown and undiscovered data around them?
Synthetic Objects
The ‘Museum of Synthetic History’ builds on these ideas. Preoccupied with the issue of bias in AI-driven research practices today, the ‘Museum of Synthetic History’ challenges previously established AI-based methodologies against data from prehistoric and geologic time archives, including the first stone tools, writing systems, and palaeontological archives of fossilised plants, organisms, and other biogenic data. How different would an AI-composed or, ontologically speaking, synthetic plant fossil seem compared with an actual sample from prehistoric floras? Would AI-manufactured proposals of newly rendered specimens be distinguishable from the remaining millions of actually existing species that were never scientifically catalogued? And, finally, what would it mean to actually produce such objects using artificial fossilisation techniques, in terms of philosophical concerns around the ontology, agency, and materiality of organic and inorganic subjects? What would bone remnants of prehistoric species look like if they were algorithmically composed and then 3D-printed in calcium phosphate? Such engineered bone tissues or artificially matured stone imprints may come across as indistinguishable from genuine palaeontological findings. What new domains of natural science will emerge when the history and ontology of floras, faunas, single-celled organisms, yeasts, moulds, rocks, minerals, and those of unearthly origin are studied by algorithmic forms of knowing? In other words, one could say that the project ‘Museum of Synthetic History’ is a thought-object experiment simulating a situation in which agents of artificial, automated reasoning are committed to the conceptualisation of their own emergence and the production of their own history and artefacts.
Collaboration with the Naturhistorisches Museum Wien
A collaboration has been established with the Naturhistorisches Museum's Geology Department. The museum's research team has provided access to its extensive collection of objects, located across multiple floors of the museum. Additionally, access to the museum's 3D scanning equipment has been granted, along with a 3D scanner sourced from the architecture department of the Academy of Fine Arts. Approximately 1,000 objects have been scanned over the course of 18 months of weekly visits to the museum's archives.
Scanned objects: 963
Processed scans: 718
Renders: 128,085
Generated content (incl. depth maps, images and 3D models): 1,032
Custom datasets: 17
Preliminary selections for CNC: 21
3D Scanning.
A powerful computing system for machine learning has been assembled, featuring a custom-built liquid cooling loop, four Nvidia RTX 3090 GPUs connected via two NVLink bridges for pooled memory, an AMD Ryzen 5950X CPU, and 64 GB of RAM in a single system.
During recent scanning sessions in the archives of the Naturhistorisches Museum Wien, a total of 954 items were successfully scanned, encompassing a diverse range of specimens, including flora, ammonites, and various other species. Currently, about 150 scans from these sessions are being processed to refine and prepare the data for further use and analysis. Additionally, the dataset has been enriched with 2,812 items, comprising both flora and fauna specimens sourced from publicly available 3D scans.
After researching 3D model generation, we considered two primary approaches: depth map generation and signed distance function (SDF) generation. The depth map approach is most suitable for flat objects, such as flora imprints, due to its high accuracy, but it requires significant manual effort to assemble models from the generated parts. Conversely, the SDF approach was selected for volumetric objects: it prioritizes overall structure preservation over fine detail but enables direct 3D model generation and interpolation without human intervention. We tested three neural network architectures: SDF-StyleGAN, Diffusion-SDF, and SDFFusion. Although diffusion models offer higher accuracy, they demand greater computing resources and require extensive code modifications for custom datasets. Ultimately, we opted for SDF-StyleGAN, which runs on consumer-grade GPUs and allows for more flexibility in data processing.
SDF-StyleGAN does not come with a dataset loader by default, so we developed a custom loader that converts meshes (in OBJ or STL format) into SDFs in a machine-readable format.
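As an illustration, a minimal sketch of such a conversion step is given below. It is not the project's actual loader; it assumes the trimesh library, samples a signed distance function on a coarse regular grid, and saves the result as a NumPy array that a voxel-based SDF network could ingest. The input file name is hypothetical.

```python
# Minimal sketch of a mesh-to-SDF conversion step (not the project's loader).
# Assumes trimesh and numpy; resolution is kept low because
# trimesh.proximity.signed_distance is exact but slow on large grids.
import numpy as np
import trimesh

def mesh_to_sdf_grid(path, resolution=64, padding=0.05):
    """Load an OBJ/STL mesh, normalize it to a unit cube, and sample an SDF grid."""
    mesh = trimesh.load(path, force='mesh')

    # Center the mesh and scale it to fit inside [-0.5, 0.5]^3.
    mesh.apply_translation(-mesh.bounding_box.centroid)
    mesh.apply_scale(1.0 / max(mesh.extents))

    # Build a regular grid of query points covering the normalized mesh.
    lin = np.linspace(-0.5 - padding, 0.5 + padding, resolution)
    grid = np.stack(np.meshgrid(lin, lin, lin, indexing='ij'), axis=-1).reshape(-1, 3)

    # trimesh convention: positive inside the mesh, negative outside.
    sdf = trimesh.proximity.signed_distance(mesh, grid)
    return sdf.reshape(resolution, resolution, resolution).astype(np.float32)

if __name__ == '__main__':
    grid = mesh_to_sdf_grid('ammonite_scan.obj')   # hypothetical file name
    np.save('ammonite_scan_sdf.npy', grid)
```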
Two training experiments were performed. First, SDF-StyleGAN was trained on the entire dataset of fossils collected at that time. This experiment revealed that the network struggled to learn from multi-domain data, producing only mediocre results. Consequently, a smaller dataset comprising only ammonite 3D scans was created for training. As anticipated, the network performed markedly better on the single-domain dataset, successfully generating high-quality ammonite-like samples.
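To illustrate what generation and interpolation without human intervention can look like downstream, the sketch below interpolates between two latent codes of a hypothetical SDF generator and extracts a mesh from each predicted SDF grid via marching cubes. The generator interface (a callable `G` mapping a latent vector to a voxel SDF grid) is an assumption for illustration and does not mirror the SDF-StyleGAN codebase.

```python
# Hedged sketch: latent-space interpolation for a voxel-SDF generator.
# `G` is a stand-in for any trained generator mapping a latent vector to an
# SDF grid of shape (res, res, res); it is NOT the SDF-StyleGAN API.
import numpy as np
import torch
import trimesh
from skimage.measure import marching_cubes

def slerp(z0, z1, t):
    """Spherical interpolation between two latent vectors."""
    omega = torch.arccos(torch.clamp(
        torch.dot(z0 / z0.norm(), z1 / z1.norm()), -1.0, 1.0))
    return (torch.sin((1 - t) * omega) * z0 + torch.sin(t * omega) * z1) / torch.sin(omega)

@torch.no_grad()
def interpolate_shapes(G, z_dim=512, steps=8, device='cuda'):
    z0, z1 = torch.randn(z_dim, device=device), torch.randn(z_dim, device=device)
    meshes = []
    for t in np.linspace(0.0, 1.0, steps):
        sdf = G(slerp(z0, z1, float(t)).unsqueeze(0))[0].cpu().numpy()
        # The zero level set of the SDF is the surface of the generated object.
        verts, faces, _, _ = marching_cubes(sdf, level=0.0)
        meshes.append(trimesh.Trimesh(vertices=verts, faces=faces))
    return meshes
```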
In this approach, we work with 2D representations of 3D models, where spatial information is encoded in grayscale images called depth maps. It has certain limitations: a depth map describes only the part of the object visible from a given viewpoint, not the whole thing, so it is best suited to surfaces and flat objects. However, the approach also has significant advantages. Because it works with images, it requires less computing power and allows work at relatively high resolution on consumer-grade GPUs. It also opens up access to a wide variety of neural networks designed for image generation. For this approach, two neural networks are being considered. StyleGAN2-ADA, recognized as one of the most efficient generative adversarial network architectures, will be used for depth map generation, training on depth maps obtained from scanned objects in order to create new ones.
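For clarity, the sketch below shows the basic principle of encoding a scan as a depth map: an orthographic grid of rays is cast down at the mesh and the hit heights are written into a grayscale image. It is an illustration using trimesh, not the TouchDesigner/Blender pipeline described below, and the file names are hypothetical.

```python
# Illustration only: orthographic depth map of a mesh via ray casting with trimesh.
import numpy as np
import trimesh
from PIL import Image

def depth_map(mesh_path, size=512):
    mesh = trimesh.load(mesh_path, force='mesh')
    lo, hi = mesh.bounds  # axis-aligned bounding box corners

    # One ray per pixel, looking straight down the z-axis (orthographic top view).
    xs = np.linspace(lo[0], hi[0], size)
    ys = np.linspace(lo[1], hi[1], size)
    xx, yy = np.meshgrid(xs, ys)
    origins = np.column_stack([xx.ravel(), yy.ravel(),
                               np.full(xx.size, hi[2] + 1.0)])
    directions = np.tile([0.0, 0.0, -1.0], (origins.shape[0], 1))

    # First-hit locations; pixels whose rays miss the mesh stay at background depth.
    locations, ray_ids, _ = mesh.ray.intersects_location(
        origins, directions, multiple_hits=False)
    depth = np.zeros(size * size, dtype=np.float32)
    depth[ray_ids] = locations[:, 2] - lo[2]          # height above the lowest point

    # Normalize to 8-bit grayscale: higher surfaces appear brighter.
    img = (255 * depth / depth.max()).astype(np.uint8).reshape(size, size)
    return Image.fromarray(img, mode='L')

if __name__ == '__main__':
    depth_map('flora_imprint.stl').save('flora_imprint_depth.png')
```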
Additionally, experimentation with CUT (Contrastive Unpaired Translation), a pix2pix-like architecture, is planned; it trains on two sets of images of different types, without requiring aligned pairs, and enables the conversion of one image type into another. In this case, CUT will be trained on depth maps and corresponding renders to facilitate texture reconstruction for the generated objects. This method will also enable the conversion of photographs of fossil specimens into depth maps, offering both artistic and scientific value by allowing the volumetric qualities of specimens archived only as photographs to be studied.
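A small sketch of dataset preparation for such a translation experiment is shown below. It assumes the folder convention used by the public CUT / CycleGAN-style repositories (an unpaired `trainA` / `trainB` layout); the source directories and dataset name are hypothetical.

```python
# Hedged sketch: arrange depth maps and renders into the unpaired trainA/trainB
# layout expected by CycleGAN/CUT-style repositories. Paths are hypothetical.
import shutil
from pathlib import Path

def build_cut_dataset(depth_dir, render_dir, out_root='datasets/fossil_depth2render'):
    out = Path(out_root)
    for sub in ('trainA', 'trainB'):
        (out / sub).mkdir(parents=True, exist_ok=True)

    # Domain A: depth maps; domain B: textured renders.
    for src in sorted(Path(depth_dir).glob('*.png')):
        shutil.copy(src, out / 'trainA' / src.name)
    for src in sorted(Path(render_dir).glob('*.png')):
        shutil.copy(src, out / 'trainB' / src.name)

if __name__ == '__main__':
    build_cut_dataset('data/depth_maps', 'data/renders')
    # Training would then be launched with the repository's own script, e.g.
    # python train.py --dataroot datasets/fossil_depth2render --name fossils --CUT_mode CUT
```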
A custom TouchDesigner patch is being used to create the depth maps. In addition, a new version of the dataset creation tool is under development in Blender, an open-source, cross-platform application. This tool will produce depth maps and renders of objects, using either default textures or predefined materials.
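The sketch below shows one way such a Blender-based exporter could work: enabling the Z pass and routing it through the compositor alongside the regular color render. It assumes Blender 2.8+ node and socket names and is not the tool under development itself; the output folder is hypothetical.

```python
# Hedged sketch of a Blender (bpy) depth-map exporter, run inside Blender.
# Assumes Blender 2.8+ node/socket names; not the project's actual tool.
import bpy

scene = bpy.context.scene
scene.use_nodes = True
bpy.context.view_layer.use_pass_z = True            # enable the depth (Z) pass

tree = scene.node_tree
tree.nodes.clear()

render_layers = tree.nodes.new("CompositorNodeRLayers")
normalize = tree.nodes.new("CompositorNodeNormalize")      # map depth to 0..1
file_out = tree.nodes.new("CompositorNodeOutputFile")
file_out.base_path = "//dataset_out"                        # hypothetical output folder
file_out.file_slots[0].path = "depth_"

# The color render goes through the normal pipeline; depth is written by the compositor.
tree.links.new(render_layers.outputs["Depth"], normalize.inputs[0])
tree.links.new(normalize.outputs[0], file_out.inputs[0])

bpy.ops.render.render(write_still=True)
```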
StyleGAN2-ADA was trained on the depth maps of the collected fossils, demonstrating effective performance with multi-domain datasets that include various types of fossils. The resultant images were then utilized for the manual reconstruction of the skull.
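Once such a network is trained, sampling new depth maps is straightforward. Below is a hedged sketch using the loading utilities of the NVlabs stylegan2-ada-pytorch repository; the checkpoint filename is hypothetical, and the repository's `dnnlib` and `legacy` modules must be importable.

```python
# Hedged sketch: sample depth maps from a trained StyleGAN2-ADA checkpoint.
# Requires the NVlabs stylegan2-ada-pytorch repo on PYTHONPATH (dnnlib, legacy).
import numpy as np
import torch
import PIL.Image
import dnnlib
import legacy

device = torch.device('cuda')
with dnnlib.util.open_url('network-snapshot-fossil-depth.pkl') as f:   # hypothetical
    G = legacy.load_network_pkl(f)['G_ema'].to(device)                 # trained generator

label = torch.zeros([1, G.c_dim], device=device)                       # unconditional model
with torch.no_grad():
    for seed in range(8):
        z = torch.from_numpy(
            np.random.RandomState(seed).randn(1, G.z_dim)).to(device)
        img = G(z, label, truncation_psi=0.7, noise_mode='const')
        # Map from [-1, 1] to 8-bit and save; a single-channel output is a depth map.
        img = (img.permute(0, 2, 3, 1) * 127.5 + 128).clamp(0, 255).to(torch.uint8)
        arr = img[0].cpu().numpy().squeeze()
        PIL.Image.fromarray(arr).save(f'generated_depth_{seed:03d}.png')
```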
The first physical object of the series, titled CAS_201, has been successfully generated using a custom-developed AI-based methodology, processed, and 3D printed in polyamide. Below are four views of the object.
Video Installation Materials
Shows and events featuring 'Content Aware Studies II: Documents of Synthetic Histories'
Tokyo | Berlin | Vienna