Human Generated Data

Title

Untitled (family holding sporting equipment in front of house)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17078

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Person 99.7
Human 99.7
Person 99.6
Person 99.6
Person 99.6
Footwear 99.5
Apparel 99.5
Shoe 99.5
Clothing 99.5
Shoe 99.4
Person 98.9
Shorts 98.9
Shoe 93.1
People 90.1
Racket 88.6
Tennis Racket 88.6
Shoe 80.2
Face 77.4
Female 73.6
Pants 71.8
Suit 68.4
Overcoat 68.4
Coat 68.4
Kid 65.8
Child 65.8
Outdoors 65.1
Dress 63.2
Photography 62.5
Photo 62.5
Leisure Activities 59.8
Door 59.7
Brick 58.6
Girl 57.3
Play 56.2
Building 55.9
Housing 55.9
Family 55.1
Costume 55
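
The Amazon tags above are label detections from AWS Rekognition, each paired with a 0-100 confidence score. A minimal sketch of how such a list can be reproduced with boto3 (the file name and the MinConfidence threshold are illustrative assumptions, not part of the record):

    import boto3

    # Assumes AWS credentials are already configured in the environment.
    rekognition = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:  # hypothetical local copy of the image
        response = rekognition.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=55,  # the lowest confidence listed above is 55
        )

    # Print "Name Confidence" pairs in the same layout as the list above.
    for label in response["Labels"]:
        print(label["Name"], round(label["Confidence"], 1))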

Imagga
created on 2022-02-26

musical instrument 55.2
wind instrument 45
accordion 39.7
keyboard instrument 31.8
man 30.2
sax 24.4
male 22.7
silhouette 20.7
people 20.1
brass 19.8
person 18.6
athlete 17.1
black 16.2
sport 15.4
sunset 15.3
ballplayer 14.2
player 14.1
outdoors 13.4
trombone 12.2
couple 12.2
adult 11.9
lifestyle 11.6
outdoor 11.5
boy 11.3
contestant 11.3
leisure 10.8
run 10.6
businessman 10.6
kin 10.4
play 10.3
love 10.3
sky 10.2
device 9.5
walking 9.5
beach 9.3
business 9.1
exercise 9.1
park 9.1
day 8.6
youth 8.5
relax 8.4
portrait 8.4
active 8.4
dark 8.3
holding 8.3
happy 8.1
together 7.9
men 7.7
sitting 7.7
summer 7.7
human 7.5
group 7.3
women 7.1
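
Imagga exposes its tagger as a REST endpoint rather than an SDK. A sketch of the call that would produce a list like the one above, assuming placeholder credentials and a hosted copy of the image:

    import requests

    API_KEY = "..."     # placeholder
    API_SECRET = "..."  # placeholder
    IMAGE_URL = "https://example.org/photo.jpg"  # hypothetical image location

    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": IMAGE_URL},
        auth=(API_KEY, API_SECRET),  # HTTP basic auth with key/secret
    )
    resp.raise_for_status()

    # Each entry carries an English label and a 0-100 confidence.
    for tag in resp.json()["result"]["tags"]:
        print(tag["tag"]["en"], round(tag["confidence"], 1))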

Google
created on 2022-02-26
(no tags returned)

Microsoft
created on 2022-02-26

text 99.1
outdoor 98.4
clothing 92
person 91.2
man 88.1
musical instrument 85.5
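
The Microsoft tags come from Azure's Computer Vision service, which reports confidence on a 0-1 scale. A sketch using the azure-cognitiveservices-vision-computervision SDK (endpoint, key, and file name are placeholders), with confidences scaled to match the percentages above:

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    ENDPOINT = "https://<resource>.cognitiveservices.azure.com/"  # placeholder
    KEY = "..."                                                   # placeholder

    client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))

    with open("photo.jpg", "rb") as f:  # hypothetical local copy of the image
        result = client.tag_image_in_stream(f)

    for tag in result.tags:
        print(tag.name, round(tag.confidence * 100, 1))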

Face analysis

AWS Rekognition

Age 36-44
Gender Male, 86.8%
Calm 86.7%
Happy 11.5%
Sad 0.9%
Surprised 0.3%
Confused 0.2%
Fear 0.2%
Disgusted 0.1%
Angry 0.1%

AWS Rekognition

Age 42-50
Gender Male, 99.8%
Happy 90.5%
Calm 3%
Sad 1.7%
Surprised 1.5%
Confused 1.5%
Disgusted 0.6%
Fear 0.6%
Angry 0.5%

AWS Rekognition

Age 48-56
Gender Male, 96.2%
Happy 91.6%
Calm 5.6%
Surprised 1.5%
Sad 0.6%
Disgusted 0.2%
Fear 0.2%
Confused 0.1%
Angry 0.1%

AWS Rekognition

Age 21-29
Gender Male, 76.3%
Sad 39.3%
Happy 36.7%
Calm 12.1%
Surprised 8%
Confused 1.9%
Fear 0.9%
Angry 0.5%
Disgusted 0.5%
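
Each AWS Rekognition block above is one entry in the FaceDetails array returned by the DetectFaces API. A sketch that prints the age range, gender, and emotion scores in the same layout (the file name is an assumption):

    import boto3

    rekognition = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:  # hypothetical local copy of the image
        response = rekognition.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # request age, gender, and emotion estimates
        )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        # Emotions arrive unsorted; sort by confidence to match the lists above.
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
        print()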

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
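
The Google Vision blocks are face_annotations from the Cloud Vision API, which reports each attribute as a likelihood bucket (VERY_UNLIKELY through VERY_LIKELY) instead of a percentage. A sketch with the google-cloud-vision client library (v2+ assumed; the file name is a placeholder):

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("photo.jpg", "rb") as f:  # hypothetical local copy of the image
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    for face in response.face_annotations:
        # Each field is a Likelihood enum; .name yields e.g. "VERY_UNLIKELY".
        print("Surprise", face.surprise_likelihood.name)
        print("Anger", face.anger_likelihood.name)
        print("Sorrow", face.sorrow_likelihood.name)
        print("Joy", face.joy_likelihood.name)
        print("Headwear", face.headwear_likelihood.name)
        print("Blurred", face.blurred_likelihood.name)
        print()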

Feature analysis

Amazon

Person 99.7%
Shoe 99.5%
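
The feature analysis percentages are per-instance confidences: for object labels such as Person and Shoe, Rekognition's DetectLabels also returns an Instances array with a bounding box and confidence for each occurrence. A sketch of extracting them (file name assumed):

    import boto3

    rekognition = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:  # hypothetical local copy of the image
        response = rekognition.detect_labels(Image={"Bytes": f.read()})

    for label in response["Labels"]:
        # Only some labels (e.g. Person, Shoe) carry located instances.
        for instance in label.get("Instances", []):
            box = instance["BoundingBox"]
            print(f"{label['Name']} {instance['Confidence']:.1f}% "
                  f"(left={box['Left']:.2f}, top={box['Top']:.2f})")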

Captions

Microsoft

a man standing in front of a building 88.6%
a man and a woman standing in front of a building 76.8%
a group of people standing in front of a building 76.7%
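
The ranked captions come from Azure Computer Vision's image description endpoint, which returns several candidate sentences with confidences. A sketch (placeholder endpoint and key as in the tag example above; max_candidates is an illustrative choice):

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    client = ComputerVisionClient(
        "https://<resource>.cognitiveservices.azure.com/",  # placeholder
        CognitiveServicesCredentials("..."),                # placeholder key
    )

    with open("photo.jpg", "rb") as f:  # hypothetical local copy of the image
        description = client.describe_image_in_stream(f, max_candidates=3)

    # Confidence is 0-1; scale it to match the percentages above.
    for caption in description.captions:
        print(caption.text, round(caption.confidence * 100, 1))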

Text analysis

Amazon

5
KODAK-2-ITW

Google

MJI7-- YT37A°2--AGOX
MJI7--
YT37A°2--AGOX
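
Both text readings are OCR output, likely picking up the film edge print: Amazon's lines come from Rekognition's DetectText, and Google's from Cloud Vision text detection, where the first annotation is the full detected string followed by its individual tokens. A sketch of both calls (file name assumed):

    import boto3
    from google.cloud import vision

    with open("photo.jpg", "rb") as f:  # hypothetical local copy of the image
        image_bytes = f.read()

    # Amazon: DetectText returns LINE and WORD detections; print the lines.
    rekognition = boto3.client("rekognition")
    amazon = rekognition.detect_text(Image={"Bytes": image_bytes})
    for det in amazon["TextDetections"]:
        if det["Type"] == "LINE":
            print(det["DetectedText"])

    # Google: the first text_annotation is the full text, then the words.
    client = vision.ImageAnnotatorClient()
    google = client.text_detection(image=vision.Image(content=image_bytes))
    for annotation in google.text_annotations:
        print(annotation.description)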