Human Generated Data

Title

Untitled (Middleboro, Kentucky)

Date

October 1935, printed later

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Museum Acquisition, P1970.3356

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2021-12-15

Person 99.8%
Human 99.8%
Apparel 99.8%
Clothing 99.8%
Person 99.4%
Person 99.4%
Person 99.3%
Person 99%
Coat 98.9%
Overcoat 98.9%
Person 98.3%
Person 92%
Tuxedo 82.6%
Suit 69.1%
People 64.4%
Sun Hat 63%
Hat 58.5%

Imagga
created on 2021-12-15

man 36.3%
male 30.5%
people 25.6%
adult 21.4%
uniform 19.6%
person 19%
business 18.8%
worker 18.4%
professional 17.1%
helmet 16.8%
construction 16.2%
clothing 15.9%
architecture 15.6%
building 15.3%
work 13.4%
builder 13.4%
engineer 13.3%
hat 13.3%
job 13.3%
manager 13%
occupation 12.8%
war 12.6%
military 12.5%
team 12.5%
city 12.5%
businessman 12.3%
portrait 12.3%
urban 12.2%
men 12%
industry 11.9%
soldier 11.7%
architect 11.6%
suit 11.4%
gun 11.3%
handsome 10.7%
happy 10.6%
guy 10.5%
group 10.5%
success 10.4%
corporate 10.3%
safety 10.1%
smiling 10.1%
foreman 10.1%
black 9.8%
outdoors 9.7%
military uniform 9.5%
teamwork 9.3%
weapon 8.8%
contractor 8.7%
boy 8.7%
engineering 8.6%
smile 8.5%
office 8.5%
site 8.4%
modern 8.4%
outdoor 8.4%
human 8.2%
successful 8.2%
looking 8%
mask 7.9%
army 7.8%
outside 7.7%
attractive 7.7%
old 7.7%
sky 7.6%
sport 7.5%
protection 7.3%
lifestyle 7.2%
women 7.1%
to 7.1%
working 7.1%
equipment 7%

Google
created on 2021-12-15

Microsoft
created on 2021-12-15

person 99.5%
clothing 97.4%
outdoor 95.9%
text 94.2%
man 89.8%
people 76.2%
group 55.3%

Face analysis

AWS Rekognition

Age 26-40
Gender Male, 94.2%
Calm 90.2%
Sad 6.5%
Fear 1.1%
Happy 0.9%
Angry 0.6%
Surprised 0.5%
Confused 0.2%
Disgusted 0%

AWS Rekognition

Age 16-28
Gender Male, 97.1%
Calm 76.3%
Sad 16.8%
Confused 4.3%
Fear 1.1%
Angry 1%
Happy 0.2%
Surprised 0.2%
Disgusted 0%

AWS Rekognition

Age 38-56
Gender Male, 74.8%
Calm 80.4%
Sad 5.1%
Surprised 4.5%
Confused 4.1%
Angry 3.7%
Fear 1.2%
Happy 0.9%
Disgusted 0.2%

AWS Rekognition

Age 26-40
Gender Male, 52.9%
Calm 77.9%
Surprised 14.4%
Happy 4.4%
Angry 1.5%
Sad 0.7%
Confused 0.6%
Fear 0.3%
Disgusted 0.2%

AWS Rekognition

Age 25-39
Gender Male, 99.9%
Calm 66.6%
Surprised 18.1%
Angry 7.7%
Confused 3.5%
Sad 2.8%
Fear 0.6%
Happy 0.3%
Disgusted 0.3%

AWS Rekognition

Age 24-38
Gender Male, 84.5%
Calm 93.6%
Sad 2.5%
Angry 1.3%
Fear 1%
Happy 0.6%
Surprised 0.5%
Disgusted 0.3%
Confused 0.2%

Microsoft Cognitive Services

Age 56
Gender Male

Microsoft Cognitive Services

Age 37
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.8%
Suit 69.1%
Hat 58.5%

Captions

Microsoft

a group of people standing in front of a building 95.8%
a group of people standing outside of a building 95.7%
a group of people standing next to a building 95.6%

Text analysis

Amazon

BLES
CASH
CIT
D
PUITS

Google

BLES
CIT CASH QUITS BLES
CIT
CASH
QUITS