Human Generated Data

Title

Untitled (woman holding wrapped boxes)

Date

1949

People

Artist: Robert Burian, American, active 1940s-1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19348

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 99
Human 99
Clothing 91.4
Apparel 91.4
Leisure Activities 72.8
Female 68.4
Sleeve 64.9
Musical Instrument 63.9
Girl 60.5
Long Sleeve 59.5
Shorts 58.8
Drum 56.7
Percussion 56.7
Accessories 55.9
Accessory 55.9
Frisbee 55.8
Toy 55.8
Curtain 55.7

Clarifai
created on 2023-10-22

people 99.8
wear 98.9
portrait 98.9
adult 98.9
one 98.5
music 98.1
costume 97.9
theater 97.1
outfit 96.1
woman 95.5
musician 94.5
opera 93.8
man 93.2
dress 92.7
veil 91.4
comedy 91
retro 86.5
singer 84.8
party 84.7
facial expression 84.6

Imagga
created on 2022-03-05

musical instrument 37
portrait 28.5
person 27.2
people 26.8
adult 26.4
man 23.5
male 21.3
black 21.2
wind instrument 20.1
accordion 19.9
fashion 18.8
face 17.7
sexy 17.7
dress 17.2
men 16.3
keyboard instrument 15.9
pretty 14.7
hair 13.5
style 13.3
attractive 13.3
looking 12.8
one 12.7
old 12.5
human 12
model 11.7
mask 11.3
percussion instrument 11.1
art 11.1
steel drum 11.1
alone 10.9
smile 10.7
clothes 10.3
women 10.3
lifestyle 10.1
book 10.1
dark 10
make 10
modern 9.8
room 9.8
urban 9.6
professional 9.4
concertina 9.2
vintage 9.1
costume 9
religion 9
happy 8.8
life 8.6
youth 8.5
casual 8.5
elegance 8.4
hand 8.3
tradition 8.3
clothing 8.3
holding 8.2
retro 8.2
posing 8
worker 8
free-reed instrument 7.8
eyes 7.7
sad 7.7
wall 7.7
serious 7.6
studio 7.6
religious 7.5
traditional 7.5
city 7.5
stylish 7.2
body 7.2
home 7.2
interior 7.1
indoors 7
device 7

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

human face 95.7
person 94.9
text 92.4
clothing 90.1
black 81.5
white 79.8
woman 72
smile 66.5
posing 65

Face analysis

AWS Rekognition

Age 27-37
Gender Male, 82.5%
Happy 66.8%
Surprised 21.6%
Calm 6.6%
Disgusted 1.7%
Fear 1.4%
Confused 0.8%
Sad 0.7%
Angry 0.4%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Person 99%

Text analysis

Amazon

S8
YТ3-XAX

Google

YT37A2-XA S8
YT37A2-XA
S8