Human Generated Data

Title

Untitled (baby yawning)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17364

Machine Generated Data

Tags (confidence, %)

Amazon
created on 2022-02-26

Clothing 98.7
Apparel 98.7
Human 93.5
Baby 85.9
Person 83.3
Bonnet 80.9
Hat 80.9
Newborn 79.3
Text 55.6
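
The Amazon labels above follow the shape of AWS Rekognition's label-detection output. As a minimal sketch, assuming a boto3 client and a hypothetical local copy of the photograph (the file name, region, and thresholds are illustrative placeholders, not part of the museum record), tags like these could be retrieved as follows:

```python
# Minimal sketch: fetching image labels with AWS Rekognition via boto3.
# The local file path, region, and thresholds are illustrative placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("untitled_baby_yawning.jpg", "rb") as f:  # hypothetical local copy of the photograph
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,        # cap on the number of returned labels
    MinConfidence=50.0,  # drop low-confidence labels
)

# Print "Label Confidence" pairs, mirroring the tag list above (confidence is 0-100).
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```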

Clarifai
created on 2023-10-29

monochrome 99.7
science 97.2
people 93.8
black and white 93.4
biology 91.8
man 90.8
medicine 90.3
body 88.6
desktop 88.6
abstract 88.3
art 87.7
jellyfish 87.4
nature 87.2
anatomy 85.2
mono 84.9
animal 82.9
shape 82.4
square 80.3
no person 80.2
underwater 77.7
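
The Clarifai concepts above resemble the response of Clarifai's v2 predict REST endpoint. A minimal sketch, assuming the public general recognition model, a placeholder API key, and an illustrative image URL (none of which come from this record; exact auth details vary by account setup):

```python
# Minimal sketch: requesting concept tags from Clarifai's v2 predict REST endpoint.
# The API key, model ID, and image URL are placeholder assumptions.
import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"        # placeholder credential
MODEL_ID = "general-image-recognition"   # assumed public general model ID
IMAGE_URL = "https://example.org/untitled_baby_yawning.jpg"  # placeholder image URL

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
    timeout=30,
)
response.raise_for_status()

# Each concept carries a name and a 0-1 probability; scale to match the percent-style list above.
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')
```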

Imagga
created on 2022-02-26

art 26.4
pattern 23.2
glass 20.9
smoke 20.7
motion 20.6
black 19.5
shape 18.6
design 18.6
curve 18.4
graphic 18.2
light 18
wave 17.3
color 17.2
transparent 17
3d 16.3
fractal 15.6
flame 15
film 14.8
swirl 14.7
space 14.7
digital 14.6
texture 13.9
science 12.5
backdrop 12.4
dynamic 12.3
decoration 11.9
mystical 11.8
love 11
lines 10.8
wallpaper 10.7
photographic paper 10.7
render 10.4
form 10.2
glow 10.2
symbol 10.1
bright 10
romance 9.8
device 9.8
backgrounds 9.7
technology 9.6
negative 9.6
flowing 9.3
yellow 9.3
energy 9.2
wedding 9.2
close 9.1
smooth 9.1
x-ray film 9.1
romantic 8.9
silk 8.9
incense 8.9
medical 8.8
engagement 8.7
burn 8.7
rose 8.6
biology 8.5
floral 8.5
creativity 8.4
frame 8.3
gold 8.2
effect 8.2
futuristic 8.1
concepts 8
shiny 7.9
artistic 7.8
trail 7.7
gift 7.7
circle 7.7
magic 7.6
unique 7.6
fire 7.5
blur 7.4
generated 7.4
flow 7.4
glowing 7.4
photographic equipment 7.2
graphics 7.2
fantasy 7.2
conceptual 7.1
container 7
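
The Imagga tags above have the shape of Imagga's /v2/tags response. A minimal sketch, assuming placeholder API credentials and an illustrative image URL:

```python
# Minimal sketch: requesting tags from Imagga's /v2/tags endpoint.
# The API key/secret pair and image URL are placeholder assumptions.
import requests

API_KEY = "YOUR_IMAGGA_API_KEY"        # placeholder credential
API_SECRET = "YOUR_IMAGGA_API_SECRET"  # placeholder credential
IMAGE_URL = "https://example.org/untitled_baby_yawning.jpg"  # placeholder image URL

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),  # HTTP Basic auth with key and secret
    timeout=30,
)
response.raise_for_status()

# Tags come back with a confidence score on a 0-100 scale.
for entry in response.json()["result"]["tags"]:
    print(f'{entry["tag"]["en"]} {entry["confidence"]:.1f}')
```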

Microsoft
created on 2022-02-26

text 95.9

Face analysis

Amazon

AWS Rekognition

Age 34-42
Gender Male, 99.1%
Calm 52.2%
Surprised 10.3%
Fear 10.1%
Sad 7.6%
Confused 6.3%
Happy 5.8%
Angry 4.6%
Disgusted 3.2%
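
The age range, gender, and emotion scores above match the structure of Rekognition's DetectFaces response. A minimal sketch, again assuming a hypothetical local copy of the photograph:

```python
# Minimal sketch: face attributes (age range, gender, emotions) via Rekognition DetectFaces.
# The local file path is an illustrative placeholder.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("untitled_baby_yawning.jpg", "rb") as f:  # hypothetical local copy of the photograph
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request the full attribute set, including emotions
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    # Emotions are returned with per-emotion confidence scores, as listed above.
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```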

Feature analysis

Amazon

Person
Person 83.3%

Categories

Imagga

interior objects 35.9%
text visuals 31.8%
paintings art 24.8%
food drinks 6.1%

Captions

Microsoft
created on 2022-02-26

a person lying on a bed 29.4%
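
The caption above resembles output from Azure Computer Vision's describe operation. A minimal sketch, assuming the azure-cognitiveservices-vision-computervision SDK and placeholder endpoint, key, and image URL:

```python
# Minimal sketch: generating an image caption with Azure Computer Vision's describe operation.
# The endpoint, key, and image URL are placeholder assumptions.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com/"  # placeholder endpoint
KEY = "YOUR_AZURE_KEY"                                           # placeholder credential
IMAGE_URL = "https://example.org/untitled_baby_yawning.jpg"      # placeholder image URL

client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))

# describe_image returns candidate captions with confidence scores in the 0-1 range.
description = client.describe_image(IMAGE_URL, max_candidates=3)
for caption in description.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")
```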