Human Generated Data

Title

Unemployed trapper, Plaquemines Parish, Louisiana

Date

1935

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.3089

Machine Generated Data

Tags (confidence, %)

Amazon
created on 2021-12-15

Person 98.9
Human 98.9
Person 97.6
Person 91.5
Building 84.9
Clothing 81.3
Apparel 81.3
Urban 80
Painting 78.5
Art 78.5
Person 75.2
People 72.3
Nature 65.6
Housing 64.7
Outdoors 62.6
Countryside 61.6
Sitting 56.7
Furniture 56.4
Wood 56.1
Shack 55.2
Rural 55.2
Hut 55.2

Clarifai
created on 2023-10-15

people 99.9
child 99
adult 98.4
two 97.8
furniture 97.1
man 96.7
three 96.3
boy 96
portrait 94.8
group 94.8
family 94
son 94
group together 93.1
woman 91
four 90.7
wear 89.7
bench 89.3
offspring 88.3
one 81.1
sit 80.7

Imagga
created on 2021-12-15

kin 42.8
man 31.6
people 29
male 28.5
adult 27.9
person 27.3
couple 27
happy 26.9
family 26.7
stretcher 24.8
home 24.7
love 23.7
mother 22.7
portrait 22.6
couch 21.2
happiness 21.1
lifestyle 20.2
together 20.1
litter 19.8
sitting 19.8
smiling 18.8
smile 18.5
sofa 18.4
indoor 18.3
father 17
indoors 16.7
husband 15.4
casual 15.2
conveyance 15.1
room 14.4
old 13.9
child 13.8
relaxing 13.6
aged 13.6
attractive 13.3
relaxed 13.1
life 13
mature 13
face 12.8
parent 12.6
cheerful 12.2
women 11.9
two 11.9
wife 11.4
togetherness 11.3
senior 11.2
computer 11.2
looking 11.2
men 11.2
son 10.9
patient 10.8
leisure 10.8
daughter 10.6
30s 10.6
fun 10.5
living 10.4
joy 10
group 9.7
comfort 9.6
elderly 9.6
boy 9.6
grandfather 9.4
day 9.4
dad 9.4
cute 9.3
horizontal 9.2
house 9.2
human 9
handsome 8.9
newspaper 8.9
living room 8.8
mid adult 8.7
married 8.6
clothing 8.4
relax 8.4
color 8.3
fashion 8.3
laptop 8.2
lady 8.1
romantic 8
interior 8
affection 7.7
pretty 7.7
relationship 7.5
alone 7.3
romance 7.1
kid 7.1

Google
created on 2021-12-15

Microsoft
created on 2021-12-15

person 92.2
drawing 89.5
text 88.3
clothing 87.9
old 82.7
sketch 80.6
furniture 80.2
human face 78.9
black and white 70.1
man 65.4
table 64.7
sitting 60.8

Color Analysis

Face analysis

AWS Rekognition

Age 48-66
Gender Male, 96.3%
Calm 95.6%
Sad 1.7%
Happy 0.9%
Confused 0.6%
Surprised 0.5%
Disgusted 0.3%
Angry 0.2%
Fear 0.1%

AWS Rekognition

Age 22-34
Gender Female, 97.4%
Fear 80.7%
Calm 11.5%
Sad 5.4%
Surprised 0.9%
Confused 0.7%
Angry 0.5%
Happy 0.3%
Disgusted 0.1%

AWS Rekognition

Age 25-39
Gender Male, 92.3%
Calm 99.2%
Happy 0.5%
Sad 0.2%
Angry 0%
Disgusted 0%
Surprised 0%
Confused 0%
Fear 0%

Microsoft Cognitive Services

Age 24
Gender Female

Microsoft Cognitive Services

Age 44
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.9%
Painting 78.5%

Categories

Captions