Human Generated Data

Title

Untitled (medicine show, Huntingdon, Tennessee)

Date

October 1935

People

Artist: Ben Shahn, American 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.1431

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Clothing 100
Hat 100
Coat 100
Sun Hat 99.7
Male 98
Man 98
Person 98
Adult 98
Male 97.9
Man 97.9
Person 97.9
Adult 97.9
Adult 97.1
Male 97.1
Man 97.1
Person 97.1
Adult 95.8
Male 95.8
Man 95.8
Person 95.8
People 95.7
Person 95.2
Person 95.1
Person 95
Adult 94.6
Male 94.6
Man 94.6
Person 94.6
Person 94.3
Baby 94.3
Person 93.6
Adult 93.2
Male 93.2
Man 93.2
Person 93.2
Person 93.2
Person 92.5
Adult 92
Male 92
Man 92
Person 92
Person 92
Face 90.2
Head 90.2
Person 89.7
Adult 88.5
Male 88.5
Man 88.5
Person 88.5
Person 87.4
Person 86.6
Cap 84.9
Person 80.1
Machine 79.6
Wheel 79.6
Wheel 79.4
Person 77.8
Person 73.7
Crowd 70.7
Wheel 70.6
Person 64.7
Overcoat 55.6
Car 55
Transportation 55
Vehicle 55
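Each tag line above pairs a label with a confidence score (0-100) reported by the tagging service. A minimal sketch of parsing these lines into structured pairs, assuming the "label score" format shown in this listing (this is an illustrative helper, not part of any official Harvard Art Museums or AWS tooling):

```python
def parse_tags(lines):
    """Parse 'label confidence' lines into (label, score) tuples.

    Labels may contain spaces (e.g. 'Sun Hat'), so the score is taken
    from the final whitespace-separated token on each line.
    """
    tags = []
    for line in lines:
        line = line.strip()
        if not line:
            continue
        label, _, score = line.rpartition(" ")
        tags.append((label, float(score)))
    return tags

sample = ["Clothing 100", "Sun Hat 99.7", "Person 98"]
print(parse_tags(sample))  # [('Clothing', 100.0), ('Sun Hat', 99.7), ('Person', 98.0)]
```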

Clarifai
created on 2018-05-11

people 99.8
group together 99.1
group 98.5
many 98.4
adult 96.6
military 95.5
vehicle 94.4
war 93.7
chair 93
administration 92.8
leader 91
soldier 89.6
man 89.3
crowd 89.2
police 88.7
outfit 87.7
woman 86.2
several 85.6
wear 85.1
uniform 82.6

Imagga
created on 2023-10-06

man 25.5
city 22.4
person 21.4
people 21.2
outdoor 19.1
building 18.5
adult 18.2
urban 17.5
street 16.6
male 15.8
uniform 14.2
portrait 12.9
looking 12.8
one 12.7
clothing 12.1
snow 12.1
men 12
work 12
outdoors 11.9
old 11.8
architecture 11.7
pedestrian 11.6
industry 11.1
winter 11.1
safety 11
protection 10.9
industrial 10.9
leisure 10.8
travel 10.6
life 10.4
lifestyle 10.1
helmet 10
transportation 9.9
military uniform 9.8
human 9.7
walking 9.5
cold 9.5
seller 9.1
history 8.9
sky 8.9
to 8.8
disaster 8.8
vehicle 8.5
youth 8.5
stretcher 8.4
passenger 8.3
danger 8.2
destruction 7.8
backpack 7.8
sitting 7.7
sport 7.7
outside 7.7
construction 7.7
cityscape 7.6
vacation 7.4
business 7.3
road 7.2
activity 7.2
conveyance 7.1
gun 7

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

person 99.6
people 83.4
group 80.1
crowd 0.9

Color Analysis

Face analysis

AWS Rekognition

Age 43-51
Gender Male, 100%
Calm 98.9%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Angry 0.4%
Confused 0.3%
Happy 0.1%
Disgusted 0.1%

AWS Rekognition

Age 23-31
Gender Male, 100%
Calm 99.9%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Confused 0%
Angry 0%
Disgusted 0%
Happy 0%

AWS Rekognition

Age 20-28
Gender Male, 99.5%
Calm 92.7%
Surprised 6.3%
Fear 5.9%
Angry 5.8%
Sad 2.3%
Confused 0.7%
Disgusted 0.1%
Happy 0%

AWS Rekognition

Age 22-30
Gender Male, 100%
Calm 93.7%
Surprised 6.4%
Fear 5.9%
Confused 2.7%
Sad 2.2%
Angry 2%
Happy 0.5%
Disgusted 0.4%

AWS Rekognition

Age 43-51
Gender Male, 100%
Sad 100%
Surprised 6.3%
Calm 6%
Fear 6%
Angry 2.9%
Happy 0.5%
Confused 0.2%
Disgusted 0.2%

AWS Rekognition

Age 48-56
Gender Male, 99.9%
Calm 66.6%
Sad 25.3%
Surprised 9.4%
Fear 6.8%
Angry 2.6%
Confused 1.9%
Disgusted 1.4%
Happy 0.8%

AWS Rekognition

Age 6-14
Gender Female, 99.8%
Calm 68.1%
Confused 12.2%
Surprised 7.9%
Fear 6.8%
Disgusted 4.6%
Sad 4.4%
Happy 2.3%
Angry 2.3%

AWS Rekognition

Age 34-42
Gender Male, 99.7%
Calm 92.2%
Surprised 6.4%
Fear 5.9%
Sad 3.9%
Angry 1.9%
Confused 0.8%
Disgusted 0.3%
Happy 0.2%

AWS Rekognition

Age 19-27
Gender Female, 52.5%
Calm 88.1%
Surprised 7.1%
Fear 6%
Angry 5.6%
Sad 2.5%
Confused 2.2%
Disgusted 0.5%
Happy 0.4%

AWS Rekognition

Age 26-36
Gender Male, 99.8%
Calm 78.9%
Surprised 11.6%
Fear 6.2%
Angry 4.9%
Happy 4.2%
Sad 2.4%
Disgusted 1.6%
Confused 0.9%

AWS Rekognition

Age 35-43
Gender Male, 99.6%
Happy 54.3%
Calm 26.5%
Surprised 7.2%
Fear 6.9%
Disgusted 5.4%
Sad 5.2%
Confused 1.6%
Angry 1.4%

AWS Rekognition

Age 21-29
Gender Male, 90.8%
Sad 99.4%
Disgusted 15.2%
Fear 10.3%
Surprised 8%
Calm 5.4%
Confused 2.9%
Angry 1.3%
Happy 0.6%

AWS Rekognition

Age 23-31
Gender Male, 99.9%
Confused 98.2%
Surprised 6.3%
Fear 5.9%
Sad 2.5%
Calm 0.3%
Disgusted 0%
Angry 0%
Happy 0%

AWS Rekognition

Age 23-31
Gender Male, 98%
Calm 70.1%
Sad 29.1%
Surprised 6.5%
Angry 6.3%
Fear 6.3%
Happy 0.8%
Confused 0.4%
Disgusted 0.3%

AWS Rekognition

Age 26-36
Gender Male, 98.9%
Calm 96.6%
Surprised 6.8%
Fear 5.9%
Sad 2.3%
Happy 0.7%
Disgusted 0.5%
Confused 0.4%
Angry 0.2%

AWS Rekognition

Age 19-27
Gender Male, 97%
Calm 61%
Disgusted 22.3%
Confused 8.4%
Surprised 7.3%
Fear 6.3%
Sad 3.1%
Angry 1.7%
Happy 1%
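The per-face blocks above resemble the output of AWS Rekognition's DetectFaces API, which returns an age range, a gender estimate, and a list of emotion types with confidence scores for each detected face. A minimal sketch of extracting the dominant emotion from a Rekognition-style record (field names follow the public API; the sample data is taken from the first face listed above):

```python
def dominant_emotion(emotions):
    """Return the (Type, Confidence) of the highest-confidence emotion
    in a Rekognition-style Emotions list."""
    top = max(emotions, key=lambda e: e["Confidence"])
    return top["Type"], top["Confidence"]

face = {
    "AgeRange": {"Low": 43, "High": 51},
    "Emotions": [
        {"Type": "CALM", "Confidence": 98.9},
        {"Type": "SURPRISED", "Confidence": 6.3},
        {"Type": "SAD", "Confidence": 2.2},
    ],
}
print(dominant_emotion(face["Emotions"]))  # ('CALM', 98.9)
```

Note that the emotion confidences are independent per-class scores, not a probability distribution, which is why they do not sum to 100 in the blocks above.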

Microsoft Cognitive Services

Age 27
Gender Female

Microsoft Cognitive Services

Age 27
Gender Male

Microsoft Cognitive Services

Age 65
Gender Male

Microsoft Cognitive Services

Age 24
Gender Female

Microsoft Cognitive Services

Age 40
Gender Male

Microsoft Cognitive Services

Age 42
Gender Male

Microsoft Cognitive Services

Age 31
Gender Male

Microsoft Cognitive Services

Age 20
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Feature analysis

Amazon

Male 98%
Man 98%
Person 98%
Adult 98%
Baby 94.3%
Wheel 79.6%
Car 55%

Text analysis

Amazon

-
- value
I
TRENNOR
value