Human Generated Data

Title

Destitute tenant farmer's family, Ozark Mountains, Arkansas

Date

1935

People

Artist: Ben Shahn, American, 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.3069

Machine Generated Data

Tags

Amazon
created on 2021-12-15

Furniture 97.7
Person 97.5
Human 97.5
Sitting 93.2
Chair 91.3
Clothing 83.5
Apparel 83.5
Face 73.7
Art 63.7
Painting 63.7
Photography 63.4
Portrait 63.4
Photo 63.4
Worker 56.9

Clarifai
created on 2023-10-15

people 100
adult 99.5
portrait 99.5
two 98.7
one 97.5
woman 97.2
three 96.6
man 96.6
child 96.4
sit 96.2
furniture 94.8
wear 92.9
elderly 91.8
family 90.6
room 89.7
son 89.5
group 88
seat 86.5
boy 85.5
retro 85.4

Imagga
created on 2021-12-15

child 19.9
man 19.5
people 16.7
old 16.7
outdoors 15.7
person 15.5
adult 14.9
male 14.9
bench 13.7
scale 13.4
sitting 12.9
time 11.9
measuring instrument 11.9
outdoor 11.5
wall 10.8
park 10.7
father 10.7
seller 10.5
instrument 10.4
brown 10.3
smiling 10.1
beach 10.1
ancient 9.5
dad 9.4
smile 9.3
attractive 9.1
stone 9
world 9
happy 8.8
mother 8.6
face 8.5
lady 8.1
brick 8.1
autumn 7.9
happiness 7.8
sea 7.8
portrait 7.8
fashion 7.5
work 7.4
teen 7.3
cute 7.2
art 7.2
hair 7.1
equipment 7.1
family 7.1
travel 7

Google
created on 2021-12-15

Microsoft
created on 2021-12-15

text 99.4
clothing 96.2
person 95.3
black and white 73.4
drawing 68.4
smile 64.4
human face 64
white goods 56.9
old 46.2

Color Analysis

Face analysis

AWS Rekognition

Age 13-23
Gender Female, 94.5%
Sad 98.1%
Calm 1.7%
Confused 0.1%
Angry 0%
Surprised 0%
Fear 0%
Disgusted 0%
Happy 0%

Microsoft Cognitive Services

Age 32
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 97.5%
Chair 91.3%
Painting 63.7%