Human Generated Data

Title

Untitled (woman, man, and dog in athletic attire)

Date

1949

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16719

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Person 99.7
Human 99.7
Clothing 99.5
Apparel 99.5
Person 99.4
Face 93.3
Person 84
Shorts 84
Female 78.8
Pants 74.6
Sleeve 71.6
Portrait 67.5
Photography 67.5
Photo 67.5
Text 66
Door 64.2
Girl 64.1
Plant 57.9
Brick 57
Screen 56.7
Electronics 56.7
Display 56.7
Monitor 56.7
Hat 56.5
Woman 56.4
LCD Screen 56
Accessories 55.1
Accessory 55.1
Glasses 55.1

Imagga
created on 2022-02-26

negative 51.1
film 41.5
person 34.6
photographic paper 31.4
man 24.2
male 24.1
people 21.2
photographic equipment 20.9
adult 19.3
shower cap 19.1
player 18
black 18
athlete 17.9
clothing 17.6
cap 15.7
portrait 14.9
sport 14.5
ballplayer 13.5
happy 13.2
headdress 13
world 12
event 11.1
face 10.6
mask 10.5
equipment 10.3
park 9.9
costume 9.8
skill 9.6
happiness 9.4
art 9.3
contestant 9.3
training 9.2
dark 9.2
playing 9.1
dress 9
human 9
couple 8.7
teacher 8.5
pretty 8.4
fashion 8.3
flag 8.3
fun 8.2
style 8.2
music 8.1
suit 8.1
patient 8
celebration 8
to 8
love 7.9
smile 7.8
championship 7.8
play 7.7
match 7.7
one 7.5
silhouette 7.4
competition 7.3
exercise 7.3
bright 7.1
romantic 7.1
night 7.1
cool 7.1

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

Face analysis

AWS Rekognition

Age 29-39
Gender Male, 96%
Calm 92%
Surprised 7%
Angry 0.3%
Happy 0.2%
Disgusted 0.2%
Confused 0.1%
Fear 0.1%
Sad 0.1%

AWS Rekognition

Age 41-49
Gender Female, 70.8%
Happy 88.2%
Surprised 9.7%
Calm 0.7%
Confused 0.4%
Disgusted 0.3%
Fear 0.3%
Sad 0.3%
Angry 0.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.7%

Captions

Microsoft

a person standing next to a window 51.2%
a person standing in front of a window 51.1%
a group of people standing next to a window 51%

Text analysis

Amazon

ANAGER
COACH
sub2titool
KODVK-SVLELA
S.T

Google

|2ub2titol ENAGER CONCH YT33A2-XA
|2ub2titol
ENAGER
CONCH
YT33A2-XA