Human Generated Data

Title

Untitled (two men breaking ground at bridge as crowd watches)

Date

1953

People

Artist: Orrion Barger, American, active 1913–1984

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6320

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Clothing 99.6
Apparel 99.6
Dress 98.8
Person 98.7
Human 98.7
Person 98.4
Person 96.6
Person 95.7
Female 94.3
Person 91.7
Face 91.6
Person 89
Sunglasses 84.1
Accessories 84.1
Accessory 84.1
Woman 82.4
Costume 79.9
Overcoat 77
Suit 77
Coat 77
Photo 74.6
Portrait 74.6
Photography 74.6
People 72.3
Fashion 71.6
Crowd 71.4
Girl 67.5
Gown 67
Robe 66.5
Wedding 60.2
Person 59.5
Wedding Gown 57.2
Head 56.8
Kid 56
Child 56
Person 45.2
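The label/score pairs above follow the response shape of Amazon Rekognition's DetectLabels API (scores are confidence percentages). A minimal sketch of how such a listing could be rendered — the boto3 call is shown commented out, the image path is hypothetical, and `render_labels` is our helper, not part of the record:

```python
# Sketch: turning a Rekognition DetectLabels-style response into the
# "Label confidence" listing above. render_labels() works on any
# response-shaped dict; the live AWS call is commented out below.

def render_labels(response, min_confidence=40.0):
    """Render labels as 'Name confidence' lines, one decimal place."""
    lines = []
    for label in response["Labels"]:
        conf = label["Confidence"]
        if conf >= min_confidence:
            lines.append(f"{label['Name']} {round(conf, 1)}")
    return lines

# A real call would look like this (requires AWS credentials):
# import boto3
# client = boto3.client("rekognition")
# with open("photo.jpg", "rb") as f:   # hypothetical path
#     response = client.detect_labels(Image={"Bytes": f.read()})

sample = {"Labels": [
    {"Name": "Clothing", "Confidence": 99.62},
    {"Name": "Person", "Confidence": 98.71},
    {"Name": "Sunglasses", "Confidence": 84.13},
]}
print("\n".join(render_labels(sample)))
```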

Imagga
created on 2022-01-22

iron lung 63.8
respirator 53.9
breathing device 40.2
device 32.3
astronaut 29.1
metal 16.9
technology 15.6
equipment 14.9
industry 13.6
industrial 13.6
man 12.1
engine 11.5
machine 11.3
old 11.1
military 10.6
black 10.2
power 10.1
city 10
retro 9.8
medicine 9.7
factory 9.6
chemical 9.6
light 9.3
adult 9
steel 8.8
urban 8.7
nuclear 8.7
work 8.6
engineering 8.6
part 8.5
wheel 8.5
mask 8.4
science 8
medical 7.9
business 7.9
radiation 7.8
people 7.8
mechanical 7.8
steam 7.7
modern 7.7
gas 7.7
war 7.7
person 7.4
safety 7.4
speed 7.3
design 7.3
protection 7.3
danger 7.3
suit 7.2
transportation 7.2

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

text 99.3
drawing 77
clothing 73.1
black and white 63.1
person 62.1
posing 40.4
clothes 29.2

Face analysis

Amazon

Google

AWS Rekognition

Age 45-51
Gender Male, 81.3%
Calm 62%
Happy 18.3%
Confused 5.3%
Fear 3.9%
Sad 3.9%
Disgusted 2.3%
Surprised 2.3%
Angry 1.8%

AWS Rekognition

Age 38-46
Gender Male, 98.1%
Calm 99.5%
Sad 0.2%
Surprised 0.1%
Fear 0.1%
Disgusted 0.1%
Angry 0.1%
Happy 0%
Confused 0%

AWS Rekognition

Age 36-44
Gender Male, 99.8%
Happy 84.8%
Sad 7.6%
Calm 2.9%
Surprised 2%
Confused 1.6%
Disgusted 0.4%
Fear 0.4%
Angry 0.4%

AWS Rekognition

Age 41-49
Gender Female, 96.5%
Angry 93.4%
Happy 3.5%
Disgusted 1.4%
Sad 0.8%
Calm 0.5%
Surprised 0.2%
Fear 0.1%
Confused 0.1%

AWS Rekognition

Age 43-51
Gender Female, 98.3%
Calm 69.4%
Happy 28%
Surprised 1.1%
Fear 0.4%
Disgusted 0.4%
Sad 0.3%
Angry 0.2%
Confused 0.2%

AWS Rekognition

Age 24-34
Gender Male, 71.6%
Calm 68.4%
Sad 8.1%
Angry 7.5%
Happy 6.5%
Confused 3.4%
Fear 2.8%
Disgusted 2%
Surprised 1.3%

AWS Rekognition

Age 35-43
Gender Male, 99.7%
Calm 53.5%
Happy 30.9%
Surprised 12.4%
Disgusted 1.1%
Confused 0.9%
Sad 0.5%
Fear 0.4%
Angry 0.4%

AWS Rekognition

Age 22-30
Gender Male, 99%
Calm 99.4%
Sad 0.3%
Confused 0.1%
Surprised 0.1%
Angry 0.1%
Happy 0.1%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 25-35
Gender Male, 99%
Calm 98.6%
Sad 1.2%
Surprised 0.1%
Confused 0.1%
Happy 0.1%
Angry 0%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 20-28
Gender Female, 80.7%
Sad 89.2%
Calm 5.8%
Happy 1.3%
Angry 1.1%
Confused 1%
Disgusted 0.6%
Surprised 0.6%
Fear 0.5%
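Each AWS Rekognition block above (age range, gender, then emotions ranked by confidence) corresponds to one `FaceDetail` returned by the DetectFaces API with `Attributes=["ALL"]`. A sketch of how one such block could be summarized — `summarize_face` is our helper, and the live call is commented out:

```python
# Sketch: summarizing one FaceDetail from Rekognition DetectFaces
# into the per-face block shown above. Works on any dict with the
# FaceDetail shape (AgeRange, Gender, Emotions).

def summarize_face(detail):
    """Return age-range, gender, and emotion lines, highest first."""
    age = f"Age {detail['AgeRange']['Low']}-{detail['AgeRange']['High']}"
    g = detail["Gender"]
    gender = f"Gender {g['Value']}, {round(g['Confidence'], 1)}%"
    # Rekognition reports emotion Type in uppercase (e.g. "CALM").
    emotions = sorted(detail["Emotions"],
                      key=lambda e: e["Confidence"], reverse=True)
    lines = [f"{e['Type'].capitalize()} {round(e['Confidence'], 1)}%"
             for e in emotions]
    return [age, gender] + lines

# A real call would look like this (requires AWS credentials):
# import boto3
# resp = boto3.client("rekognition").detect_faces(
#     Image={"Bytes": image_bytes}, Attributes=["ALL"])
# for detail in resp["FaceDetails"]:
#     print("\n".join(summarize_face(detail)))
```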

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Feature analysis

Amazon

Person 98.7%
Sunglasses 84.1%

Captions

Microsoft

a group of people posing for a photo 59%
a group of people posing for the camera 58.9%
a group of people posing for a picture 58.8%

Text analysis

Amazon

975
AB
for
331-2-
YT330°
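The strings above are the output shape of Amazon Rekognition's DetectText API, which returns both LINE- and WORD-level detections. A sketch of extracting the line-level strings — `detected_lines` is our helper, and the live call is commented out:

```python
# Sketch: pulling line-level strings out of a Rekognition
# DetectText-style response, as in the "Text analysis" listing above.

def detected_lines(response):
    """Keep only LINE-type detections, in the order returned."""
    return [d["DetectedText"] for d in response["TextDetections"]
            if d["Type"] == "LINE"]

# A real call would look like this (requires AWS credentials):
# import boto3
# response = boto3.client("rekognition").detect_text(
#     Image={"Bytes": image_bytes})

sample = {"TextDetections": [
    {"DetectedText": "975", "Type": "LINE"},
    {"DetectedText": "975", "Type": "WORD"},
]}
print(detected_lines(sample))
```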

Google

ror
ror