Human Generated Data

Title

Untitled (Crossville, Tennessee)

Date

1937

People

Artist: Ben Shahn, American 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Museum Acquisition, P1970.3520

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Person 99.7
Human 99.7
Person 99.6
Person 99.5
Person 99.5
Person 99.4
Nature 99.3
Apparel 98.9
Clothing 98.9
Outdoors 98.9
Shack 95
Rural 95
Hut 95
Countryside 95
Building 95
Shoe 92.9
Footwear 92.9
Person 88.4
Shoe 86.9
Dugout 76.5
Person 76.5
Shoe 71
Cap 67.6
Helmet 63.3
People 60.9
Hat 60.7
Hat 60
Pants 59.7
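
The list above is typical output of the Amazon Rekognition DetectLabels API, which returns label names with 0-100 confidence scores. A minimal sketch of how such tags could be produced with boto3; the file name and thresholds are illustrative assumptions, not the museum's actual pipeline:

import boto3

# Assumed local copy of the photograph; any JPEG or PNG bytes will do.
with open("crossville_tennessee_1937.jpg", "rb") as f:
    image_bytes = f.read()

rekognition = boto3.client("rekognition")

# DetectLabels returns names with confidence scores on a 0-100 scale,
# matching entries such as "Person 99.7" and "Shack 95" above.
response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,
    MinConfidence=55,
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")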

Clarifai
created on 2023-10-25

people 99.9
group 99.4
group together 99.3
man 97.2
uniform 94.9
adult 93.8
veil 93.8
military 93.4
many 92.8
lid 92.5
child 92.2
soldier 92.1
outfit 91
boy 90.3
woman 90.1
police 89.9
war 88.8
wear 86.5
administration 82.2
leader 79.6

Imagga
created on 2022-01-08

man 26.3
city 24.9
people 23.4
person 22.4
men 19.7
male 16.4
adult 15.6
hairdresser 15.4
urban 14.8
shop 14.8
musical instrument 13.8
photographer 13
old 11.8
world 11.7
women 11.1
historic 11
architecture 10.9
barbershop 10.7
black 10.2
street 10.1
building 10
portrait 9.7
room 9.7
indoors 9.7
mercantile establishment 9.6
lifestyle 9.4
life 9.1
business 9.1
human 9
outdoors 9
history 8.9
new 8.9
happy 8.8
ancient 8.6
work 8.6
day 8.6
uniform 8.6
two 8.5
safety 8.3
clothing 8.2
family 8
worker 8
home 8
scene 7.8
travel 7.7
industry 7.7
statue 7.6
passenger 7.6
house 7.5
drum 7.5
one 7.5
style 7.4
occupation 7.3
holiday 7.2
to 7.1

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

person 99.7
clothing 99.3
man 95.9
text 90.4
black and white 52.3
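
The Microsoft tags above resemble the Tags feature of the Azure Computer Vision Analyze Image REST API, which returns tag names with 0-1 confidences (shown on this page scaled to percentages). A minimal sketch against the plain REST endpoint; the resource endpoint, key, and API version here are assumptions:

import requests

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"  # assumed resource
KEY = "<subscription-key>"  # assumed credential

with open("crossville_tennessee_1937.jpg", "rb") as f:
    image_bytes = f.read()

# Analyze Image (v3.2) with the Tags visual feature.
response = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={
        "Ocp-Apim-Subscription-Key": KEY,
        "Content-Type": "application/octet-stream",
    },
    data=image_bytes,
)
response.raise_for_status()

# Confidences are 0-1; multiply by 100 to match the percentages listed above.
for tag in response.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")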

Color Analysis

Face analysis

AWS Rekognition

Age 31-41
Gender Male, 99%
Calm 50.1%
Confused 35.4%
Angry 8%
Disgusted 2.4%
Surprised 1.8%
Sad 1.3%
Happy 0.6%
Fear 0.3%

AWS Rekognition

Age 47-53
Gender Male, 100%
Calm 47.6%
Confused 24.9%
Angry 21.1%
Sad 3.5%
Disgusted 1.6%
Happy 0.6%
Surprised 0.4%
Fear 0.4%

AWS Rekognition

Age 34-42
Gender Male, 98.5%
Calm 70.6%
Happy 14.3%
Disgusted 5.5%
Angry 3.6%
Confused 3.1%
Sad 1.5%
Surprised 0.8%
Fear 0.6%

AWS Rekognition

Age 25-35
Gender Male, 93.7%
Calm 99.8%
Sad 0.1%
Surprised 0%
Disgusted 0%
Confused 0%
Happy 0%
Angry 0%
Fear 0%

AWS Rekognition

Age 36-44
Gender Male, 98.8%
Calm 77.3%
Angry 11%
Sad 4.9%
Confused 3.8%
Surprised 1.5%
Disgusted 0.6%
Happy 0.5%
Fear 0.4%
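
Each AWS Rekognition block above (age range, gender, and emotion percentages) corresponds to one face returned by the DetectFaces API with all attributes requested. A minimal sketch, again assuming a local copy of the image:

import boto3

with open("crossville_tennessee_1937.jpg", "rb") as f:
    image_bytes = f.read()

rekognition = boto3.client("rekognition")

# Attributes=["ALL"] adds AgeRange, Gender, and Emotions to each FaceDetail.
response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions come back with confidences; print them highest first.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")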

Microsoft Cognitive Services

Age 39
Gender Male

Microsoft Cognitive Services

Age 34
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely
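
The Google Vision rows report categorical likelihoods (Very unlikely through Very likely) for each detected face rather than numeric scores. A minimal sketch with the google-cloud-vision client library; the enum values print as names such as VERY_UNLIKELY, which this page renders as "Very unlikely":

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("crossville_tennessee_1937.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Each FaceAnnotation carries likelihood enums for several attributes.
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)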

Feature analysis

Amazon

Person 99.7%
Shoe 92.9%
Helmet 63.3%
Hat 60.7%

Categories