Human Generated Data

Title

Scene in front of church, Little Rock, Arkansas

Date

1935

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.3051

Machine Generated Data

Tags (confidence, %)

Amazon
created on 2021-12-15

Person 99.7
Human 99.7
Person 99.7
Skin 99.4
Clothing 98.2
Apparel 98.2
Sitting 87.9
Face 81
Sleeve 73.3
Hat 64.8
Cap 60.1
Wall 58.2
Man 57.9
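
The Amazon tags above match the shape of output from Rekognition's DetectLabels operation. Below is a minimal sketch of how such a list might be produced with boto3, assuming configured AWS credentials; the filename is hypothetical.

```python
# Minimal sketch: label detection with Amazon Rekognition via boto3.
# Assumes AWS credentials are configured; "scene_little_rock.jpg" is a
# hypothetical local copy of the photograph.
import boto3

client = boto3.client("rekognition")

with open("scene_little_rock.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=55,  # the tags listed above bottom out near 57.9
    )

for label in response["Labels"]:
    # Prints e.g. "Person 99.7", the format used in the list above.
    print(f"{label['Name']} {label['Confidence']:.1f}")
```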

Clarifai
created on 2023-10-15

people 99.9
portrait 99.8
two 99.6
child 99.6
three 98.7
family 98
boy 97.5
four 97.5
adult 97.5
son 97.3
man 97.1
offspring 95.6
woman 94.9
retro 93.3
group 91
old 90.4
wear 89.6
sibling 88.1
vintage 87.9
girl 87.9
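
Clarifai returns "concepts" with confidences in [0, 1], which the list above shows rescaled to percentages. A sketch against Clarifai's v2 predict REST endpoint follows; the API key, model alias, and image URL are placeholders, and exact routing can vary with the API version.

```python
# Minimal sketch: concept tagging with Clarifai's v2 predict endpoint.
# Key, model alias, and image URL are placeholders.
import requests

MODEL_ID = "general-image-recognition"  # alias for Clarifai's general model
headers = {"Authorization": "Key YOUR_CLARIFAI_API_KEY"}
payload = {
    "inputs": [
        {"data": {"image": {"url": "https://example.com/scene_little_rock.jpg"}}}
    ]
}

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers=headers,
    json=payload,
).json()

for concept in resp["outputs"][0]["data"]["concepts"]:
    # Prints e.g. "people 99.9", matching the list above.
    print(f"{concept['name']} {concept['value'] * 100:.1f}")
```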

Imagga
created on 2021-12-15

person 30.8
man 25
people 20.1
male 19.8
adult 19.4
seller 19.1
building 18.1
city 17.5
urban 15.7
newspaper 15.6
portrait 14.9
outdoor 14.5
black 14.4
business 14
wall 13.9
scholar 13.6
world 13.5
street 12.9
outdoors 12.8
model 12.4
fashion 12.1
product 12
child 11.9
alone 11.9
intellectual 11.8
lifestyle 11.6
men 11.2
juvenile 10.7
businessman 10.6
looking 10.4
youth 10.2
happy 10
silhouette 9.9
hair 9.5
walking 9.5
creation 9.4
handsome 8.9
sad 8.7
lonely 8.7
architecture 8.6
corporate 8.6
outside 8.6
casual 8.5
modern 8.4
pretty 8.4
attractive 8.4
old 8.4
teenager 8.2
industrial 8.2
posing 8
percussion instrument 7.9
boy 7.8
standing 7.8
face 7.8
musical instrument 7.8
travel 7.7
sitting 7.7
construction 7.7
walk 7.6
one 7.5
life 7.4
teen 7.4
window 7.3
love 7.1
job 7.1
work 7.1
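
Imagga exposes tagging through its v2 /tags endpoint, authenticated with HTTP Basic credentials. A sketch; the key, secret, and image URL are placeholders.

```python
# Minimal sketch: tagging with Imagga's v2 /tags endpoint.
# API key/secret and the image URL are placeholders.
import requests

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/scene_little_rock.jpg"},
    auth=("YOUR_IMAGGA_API_KEY", "YOUR_IMAGGA_API_SECRET"),
).json()

for item in resp["result"]["tags"]:
    # Prints e.g. "person 30.8"; Imagga's confidences run noticeably
    # lower than the other services' scores.
    print(f"{item['tag']['en']} {item['confidence']:.1f}")
```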

Google
created on 2021-12-15

Microsoft
created on 2021-12-15

clothing 95.6
outdoor 95.3
human face 94.1
sitting 94
text 93.2
black and white 91.6
person 87.1
woman 80.6
monochrome 73.7
street 61.9
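
Tags like these correspond to Azure Computer Vision's Analyze Image operation. A sketch against the v3.2 REST endpoint; the resource endpoint, subscription key, and image URL are placeholders.

```python
# Minimal sketch: tagging with Azure Computer Vision's Analyze Image API
# (v3.2 REST endpoint). Endpoint, key, and image URL are placeholders.
import requests

ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"
headers = {"Ocp-Apim-Subscription-Key": "YOUR_AZURE_KEY"}

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers=headers,
    json={"url": "https://example.com/scene_little_rock.jpg"},
).json()

for tag in resp["tags"]:
    # Prints e.g. "clothing 95.6"; Azure reports confidence in [0, 1].
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")
```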

Face analysis

AWS Rekognition

Age 5-15
Gender Female, 74.7%
Sad 59.7%
Angry 39%
Confused 0.6%
Calm 0.4%
Fear 0.1%
Disgusted 0.1%
Surprised 0%
Happy 0%

AWS Rekognition

Age 22-34
Gender Male, 64.1%
Fear 61.3%
Sad 21.4%
Calm 12.5%
Surprised 3.3%
Angry 0.8%
Confused 0.4%
Happy 0.2%
Disgusted 0.1%
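
The two per-face breakdowns above match the shape of Rekognition's DetectFaces output when full attributes are requested. A minimal sketch, again assuming configured AWS credentials and a hypothetical filename:

```python
# Minimal sketch: per-face attributes with Amazon Rekognition DetectFaces.
# Assumes AWS credentials are configured; the filename is hypothetical.
import boto3

client = boto3.client("rekognition")

with open("scene_little_rock.jpg", "rb") as f:
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age range, gender, emotions, etc.
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]    # e.g. {"Low": 5, "High": 15}
    gender = face["Gender"]   # e.g. {"Value": "Female", "Confidence": 74.7}
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        # Prints e.g. "Sad 59.7%", matching the rows above.
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```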

Microsoft Cognitive Services

Age 9
Gender Female

Microsoft Cognitive Services

Age 30
Gender Female
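
Point estimates like these would have come from a call such as the Azure Face API's detect operation with age and gender in returnFaceAttributes. Microsoft has since retired those attributes, so the sketch below shows the historical call only; endpoint, key, and image URL are placeholders.

```python
# Minimal sketch: the historical Azure Face API detect call that returned
# age/gender estimates. Microsoft has retired these attributes, so this
# is illustrative only; endpoint, key, and image URL are placeholders.
import requests

ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"
headers = {"Ocp-Apim-Subscription-Key": "YOUR_AZURE_KEY"}

resp = requests.post(
    f"{ENDPOINT}/face/v1.0/detect",
    params={"returnFaceAttributes": "age,gender"},
    headers=headers,
    json={"url": "https://example.com/scene_little_rock.jpg"},
).json()

for face in resp:
    attrs = face["faceAttributes"]
    # Prints e.g. "Age 9" and "Gender Female", matching the rows above.
    print(f"Age {attrs['age']:.0f}")
    print(f"Gender {attrs['gender'].title()}")
```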

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely
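
Google Vision reports face attributes as likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than numeric scores, which is why the rows above carry no percentages. A sketch with the google-cloud-vision client library, assuming application-default credentials and a hypothetical filename:

```python
# Minimal sketch: face detection with the Google Cloud Vision client.
# Assumes GOOGLE_APPLICATION_CREDENTIALS is set; filename is hypothetical.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("scene_little_rock.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Each attribute is a Likelihood enum, e.g. Likelihood.VERY_UNLIKELY.
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```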

Feature analysis

Amazon

Person 99.7%

Categories

Imagga

paintings art 97.7%
text visuals 1.7%
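
The category labels above ("paintings art", "text visuals") match Imagga's personal_photos categorizer. A sketch against its v2 categorization endpoint; credentials and image URL are placeholders.

```python
# Minimal sketch: Imagga's v2 categorization endpoint with the
# "personal_photos" categorizer. Credentials and URL are placeholders.
import requests

resp = requests.get(
    "https://api.imagga.com/v2/categories/personal_photos",
    params={"image_url": "https://example.com/scene_little_rock.jpg"},
    auth=("YOUR_IMAGGA_API_KEY", "YOUR_IMAGGA_API_SECRET"),
).json()

for cat in resp["result"]["categories"]:
    # Prints e.g. "paintings art 97.7%", matching the list above.
    print(f"{cat['name']['en']} {cat['confidence']:.1f}%")
```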