Human Generated Data

Title

Untitled (woman in garden)

Date

c. 1966

People

Artist: Robert Burian, American, active 1940s-1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19230

Machine Generated Data

Tags

Amazon
created on 2022-02-25

Clothing 96.8
Apparel 96.8
Person 96.6
Human 96.6
Sleeve 88.9
Evening Dress 80.1
Gown 80.1
Robe 80.1
Fashion 80.1
Female 60.8
Door 60.3
Long Sleeve 59.2
Advertisement 57.8
Home Decor 56.9
Text 55.8

Clarifai
created on 2023-10-22

people 99
woman 98
portrait 96.6
fashion 95.3
street 95
girl 94.7
wedding 94.5
adult 94.2
one 93.8
art 92.2
umbrella 91.7
dress 91.5
love 91.1
model 89.9
child 89.6
man 88.3
sit 87.7
old 87.7
door 86.1
step 85.7

Imagga
created on 2022-02-25

support 27.1
device 21.7
black 20.4
portrait 16.2
step 15.6
old 14.6
stairs 12.9
culture 12.8
man 12.3
bookend 12
art 11.9
dress 11.7
house 11.7
person 11
vintage 10.7
building 10.7
people 10.6
structure 10.4
alone 10
sill 10
city 10
window 9.8
adult 9.8
pretty 9.8
barrier 9.7
home 9.6
design 9.6
structural member 9.2
book jacket 9.1
attractive 9.1
history 8.9
wall 8.8
symbol 8.7
sitting 8.6
face 8.5
architecture 8.3
style 8.2
jacket 8.1
smiling 8
hair 7.9
urban 7.9
room 7.8
ancient 7.8
travel 7.7
entrance 7.7
youth 7.7
post 7.6
happy 7.5
fun 7.5
one 7.5
outdoors 7.5
tourism 7.4
retro 7.4
lifestyle 7.2
male 7.2
obstruction 7.2

Google
created on 2022-02-25

Microsoft
created on 2022-02-25

text 99.7
clothing 96.9
person 94.4
toddler 79.6
baby 67.5
dress 66.6
picture frame 47.1

Face analysis

AWS Rekognition

Age 18-24
Gender Female, 99%
Calm 84.6%
Happy 13.5%
Surprised 0.5%
Angry 0.4%
Confused 0.4%
Fear 0.2%
Disgusted 0.2%
Sad 0.2%

Microsoft Cognitive Services

Age 23
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Person 96.6%

Text analysis

Amazon

133
23-3

Google

233
13
....
......
....m