Human Generated Data

Title

Untitled (boys sitting on fence)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17616

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Person 99.7
Human 99.7
Person 99.5
Person 99.2
Clothing 98.7
Apparel 98.7
Person 95.4
Shoe 93.9
Footwear 93.9
Shoe 89.8
Shoe 85.1
Face 83.1
Pants 82.5
Coat 73.8
Outdoors 72.8
Nature 69.7
Photography 68.9
Portrait 68.9
Photo 68.9
Sleeve 67.6
Man 66.1
Overcoat 64.8
Suit 64.8
People 61.6
Female 60.8
Art 59.6
Drawing 59.6
Leisure Activities 58.5
Shoe 58.1
Plant 57.1
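
The Amazon label list above contains repeated labels at different confidences (e.g. four separate "Shoe" detections and four "Person" detections). A minimal Python sketch, assuming the tags are available as (label, confidence) pairs, shows one way such output could be deduplicated, keeping the highest score per label. The `dedupe_labels` helper is hypothetical, not part of any museum or AWS API:

```python
def dedupe_labels(tags):
    """Keep the highest confidence seen for each label,
    then sort by confidence, highest first."""
    best = {}
    for label, conf in tags:
        if conf > best.get(label, 0.0):
            best[label] = conf
    return sorted(best.items(), key=lambda kv: -kv[1])

# Sample values copied from the Amazon tag list above.
tags = [
    ("Person", 99.7), ("Person", 99.5), ("Person", 99.2), ("Person", 95.4),
    ("Shoe", 93.9), ("Shoe", 89.8), ("Shoe", 85.1), ("Shoe", 58.1),
    ("Clothing", 98.7),
]

print(dedupe_labels(tags))
# [('Person', 99.7), ('Clothing', 98.7), ('Shoe', 93.9)]
```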

Imagga
created on 2022-02-26

sax 81.1
wind instrument 33.2
man 30.2
male 23.4
people 22.9
person 22.3
brass 21.7
adult 17.2
men 15.4
sport 14.2
business 14
active 13.8
businessman 13.2
musical instrument 13
lifestyle 13
black 12.6
professional 12.5
exercise 11.8
human 10.5
portrait 10.3
summer 10.3
leisure 10
outdoor 9.9
suit 9.9
couple 9.6
boy 9.6
life 9.5
work 9.5
corporate 9.4
motion 9.4
action 9.3
worker 9
outdoors 8.9
sky 8.9
success 8.8
job 8.8
happy 8.8
wall 8.5
smile 8.5
guy 8.4
manager 8.4
health 8.3
pose 8.1
fitness 8.1
new 8.1
team 8.1
group 8.1
equipment 8
building 8
youth 7.7
beach 7.6
fun 7.5
company 7.4
art 7.2
handsome 7.1
drawing 7.1
happiness 7

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

outdoor 97.8
drawing 95.5
man 93.2
text 92.3
sketch 89.2
person 86
clothing 76.5
black and white 60.7

Face analysis

Amazon

Google

AWS Rekognition

Age 30-40
Gender Male, 98.5%
Calm 64.6%
Sad 29.2%
Confused 1.7%
Happy 1.4%
Disgusted 0.9%
Fear 0.8%
Angry 0.7%
Surprised 0.7%

AWS Rekognition

Age 16-24
Gender Male, 80%
Calm 52.8%
Happy 35.9%
Confused 2.9%
Sad 2.3%
Fear 2.1%
Disgusted 1.9%
Surprised 1.6%
Angry 0.6%

AWS Rekognition

Age 26-36
Gender Male, 79.6%
Calm 92.9%
Surprised 2.9%
Sad 2.3%
Happy 0.5%
Confused 0.4%
Angry 0.4%
Disgusted 0.3%
Fear 0.2%

AWS Rekognition

Age 49-57
Gender Male, 92.3%
Happy 75.4%
Calm 21.9%
Confused 1.4%
Surprised 0.4%
Disgusted 0.4%
Angry 0.2%
Sad 0.1%
Fear 0%
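
Each AWS Rekognition face block above reports a full emotion distribution; the dominant emotion is simply the highest-scoring entry. A minimal sketch, with scores copied from the fourth face above (the `dominant_emotion` helper is hypothetical, not a Rekognition API call):

```python
def dominant_emotion(scores):
    """Return the (emotion, score) pair with the highest score."""
    return max(scores.items(), key=lambda kv: kv[1])

# Emotion scores from the fourth AWS Rekognition face above.
face4 = {
    "Happy": 75.4, "Calm": 21.9, "Confused": 1.4, "Surprised": 0.4,
    "Disgusted": 0.4, "Angry": 0.2, "Sad": 0.1, "Fear": 0.0,
}

print(dominant_emotion(face4))
# ('Happy', 75.4)
```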

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.7%
Shoe 93.9%

Captions

Microsoft

a group of people standing in front of a building 91%
a group of people posing for a photo 87.6%
a group of men standing in front of a building 87.5%

Text analysis

Amazon

DEI
KODAK

Google

YT37A°2-A
YT37A°2-A