Human Generated Data

Title

Untitled (auction, New Carlisle, Ohio)

Date

July 30, 1938

People

Artist: Ben Shahn, American, 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.786

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Clothing 100
Sun Hat 100
Person 99.5
Adult 99.5
Male 99.5
Man 99.5
Person 99.3
Adult 99.3
Male 99.3
Man 99.3
Person 99.3
Adult 99.3
Male 99.3
Man 99.3
Person 99.2
Adult 99.2
Male 99.2
Man 99.2
Person 99.2
Adult 99.2
Male 99.2
Man 99.2
Man 99.2
Person 99.2
Adult 99.2
Male 99.2
Person 99.1
Architecture 99
Building 99
House 99
Housing 99
Porch 99
Man 98.9
Person 98.9
Adult 98.9
Male 98.9
Footwear 94.5
Shoe 94.5
Shoe 92.4
Accessories 87
Formal Wear 87
Tie 87
Shoe 85.9
Shoe 85.4
Tie 83.4
Hat 77.9
Face 73.2
Head 73.2
Shoe 73.1
Hat 72.7
Outdoors 71.6
Shoe 70.3
Glasses 64.8
Shoe 61.6
Shoe 58.3
People 57.1
Shorts 56.6
Nature 56.3
Pants 56
Sitting 55.7
Cowboy Hat 55.1
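
The Amazon tags above are the kind of label/confidence pairs returned by the AWS Rekognition DetectLabels API. A minimal sketch of how such output could be generated with boto3, assuming the photograph is available as a local JPEG (the file name here is hypothetical and AWS credentials are configured):

    import boto3

    # Sketch: send the image bytes to Rekognition and print each detected
    # label with its confidence score, mirroring the list above.
    # "auction_new_carlisle.jpg" is a hypothetical local file name.
    rekognition = boto3.client("rekognition")

    with open("auction_new_carlisle.jpg", "rb") as f:
        image_bytes = f.read()

    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=50,
        MinConfidence=55,
    )

    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")

Repeated labels such as Person or Shoe reflect multiple detected instances of the same object class, each reported with its own confidence.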

Clarifai
created on 2018-05-11

people 100
group together 99.5
group 99.3
adult 99
home 97.9
several 97
man 96.4
child 96.3
four 96
many 95.6
woman 94.5
five 93.8
wear 93.7
position 91.2
three 90.8
boy 89.4
family 89.1
two 88.9
offspring 86.3
sibling 85.4

Imagga
created on 2023-10-06

man 30.2
people 26.8
person 25.4
male 22.1
weapon 21.7
athlete 21.6
player 20.8
sword 19.6
ballplayer 19.5
sport 18.7
cricket equipment 17.9
sports equipment 17
wicket 16.8
uniform 16.3
stretcher 16
equipment 15.8
adult 15
outdoors 14.9
child 14.4
boy 13.9
contestant 13.9
competition 13.7
family 13.3
day 13.3
walking 13.2
outdoor 13
ball 12.9
men 12.9
litter 12.8
couple 12.2
military 11.6
old 11.1
grass 11.1
clothing 10.8
soldier 10.7
happy 10.6
kin 10.5
walk 10.5
helmet 10.5
portrait 10.3
world 10.3
conveyance 10.1
protection 10
holding 9.9
fun 9.7
war 9.6
standing 9.6
love 9.5
smiling 9.4
street 9.2
playing 9.1
girls 9.1
together 8.8
happiness 8.6
industry 8.5
two 8.5
senior 8.4
summer 8.4
leisure 8.3
rifle 8.2
recreation 8.1
history 8
lifestyle 7.9
gun 7.9
mother 7.7
culture 7.7
school 7.6
statue 7.6
tradition 7.4
active 7.3
exercise 7.3
home 7.2
game 7.1
smile 7.1
kid 7.1
to 7.1
worker 7.1
travel 7

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

person 99.5
building 99.2
outdoor 97.1
black 70.3
group 66.6

Color Analysis

Face analysis

AWS Rekognition

Age 52-60
Gender Female, 99.7%
Calm 71.9%
Sad 11.3%
Fear 7.9%
Surprised 7.1%
Confused 3.1%
Happy 2.7%
Angry 2.6%
Disgusted 1%

AWS Rekognition

Age 48-54
Gender Female, 62%
Calm 74%
Sad 27%
Surprised 6.4%
Fear 5.9%
Angry 4.7%
Confused 0.6%
Happy 0.3%
Disgusted 0.2%

AWS Rekognition

Age 63-73
Gender Male, 100%
Calm 96.2%
Surprised 6.5%
Fear 5.9%
Sad 2.7%
Angry 0.5%
Confused 0.5%
Happy 0.3%
Disgusted 0.2%

AWS Rekognition

Age 48-54
Gender Male, 100%
Sad 99.7%
Confused 26.3%
Surprised 6.5%
Fear 6%
Calm 3.3%
Disgusted 2.3%
Angry 0.3%
Happy 0.2%

AWS Rekognition

Age 59-67
Gender Male, 100%
Confused 49.8%
Sad 26.7%
Disgusted 18.3%
Surprised 7.2%
Calm 6.3%
Fear 6.2%
Angry 2.4%
Happy 1%

AWS Rekognition

Age 50-58
Gender Male, 99.7%
Calm 99.3%
Surprised 6.3%
Fear 5.9%
Sad 2.3%
Angry 0.1%
Confused 0.1%
Happy 0%
Disgusted 0%

AWS Rekognition

Age 59-67
Gender Male, 100%
Calm 66.5%
Confused 14.2%
Surprised 12.5%
Fear 6.1%
Sad 4.1%
Disgusted 3%
Angry 1.2%
Happy 0.8%

Microsoft Cognitive Services

Age 38
Gender Male
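
The per-face age ranges, gender estimates, and emotion scores above correspond to what the AWS Rekognition DetectFaces API returns when all facial attributes are requested. A minimal sketch, assuming the same hypothetical local image file as in the label example:

    import boto3

    # Sketch: request full facial attributes and print, for each detected
    # face, the estimated age range, gender, and emotions sorted by
    # confidence, matching the per-face blocks listed above.
    rekognition = boto3.client("rekognition")

    with open("auction_new_carlisle.jpg", "rb") as f:
        image_bytes = f.read()

    response = rekognition.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        emotions = sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True)
        for emotion in emotions:
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")

Note that the emotion confidences for a single face need not sum to 100%; each score is reported independently by the service.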

Feature analysis

Amazon

Person 99.5%
Adult 99.5%
Male 99.5%
Man 99.5%
Shoe 94.5%
Tie 87%
Hat 77.9%
Glasses 64.8%