Human Generated Data

Title

Untitled (Arkansas)

Date

October 1935, printed later

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Museum Acquisition, P1970.3454

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-05

Child 99.4
Female 99.4
Girl 99.4
Person 99.4
Child 98.9
Person 98.9
Boy 98.9
Male 98.9
Face 95.4
Head 95.4
Clothing 85.8
Footwear 85.8
Shoe 85.8
Person 85.6
Baby 85.6
Tin 84.5
Photography 84.2
Portrait 84.2
Shoe 78.9
Outdoors 77.5
Can 67.8
Wood 57.7
Body Part 57.7
Finger 57.7
Hand 57.7
Shorts 57.4
Slum 56.8
Dress 56.1
City 55.6
Road 55.6
Street 55.6
Urban 55.6

Clarifai
created on 2018-05-10

people 100
child 98.8
one 98.2
adult 97.9
two 97.6
wear 96.6
group 96.6
portrait 93.8
three 93.5
woman 91.9
man 90.4
several 87.9
boy 87.2
four 86.7
group together 86.5
five 81.3
veil 80.9
education 80.5
facial expression 80.5
administration 79.9

Imagga
created on 2023-10-05

old 20.9
ancient 14.7
person 12.9
sculpture 12.6
art 12.4
portrait 12.3
people 12.3
architecture 11.7
house 11.7
vintage 11.6
building 11.4
dirty 10.8
city 10.8
man 10.8
history 10.7
world 10.6
wall 10.4
culture 10.3
street 10.1
black 9.2
travel 9.2
door 9.1
child 9.1
adult 9.1
statue 8.9
death 8.7
love 8.7
face 8.5
grunge 8.5
head 8.4
historic 8.2
makeup 8.2
dress 8.1
religion 8.1
male 8
stone 7.8
attractive 7.7
fashion 7.5
dark 7.5
human 7.5
religious 7.5
water 7.3
alone 7.3
home 7.2
holiday 7.2
window 7.1
look 7

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

white 75.3
old 56.1
vintage 32.6
picture frame 7.6

Face analysis

AWS Rekognition

Age 11-19
Gender Female, 100%
Calm 99%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Confused 0.1%
Happy 0.1%
Angry 0.1%
Disgusted 0%

AWS Rekognition

Age 2-8
Gender Male, 100%
Confused 58.9%
Calm 37.3%
Surprised 6.7%
Fear 6.1%
Sad 2.5%
Happy 0.9%
Disgusted 0.2%
Angry 0.2%

Microsoft Cognitive Services

Age 8
Gender Female

Microsoft Cognitive Services

Age 7
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Feature analysis

Amazon

Child 99.4%
Female 99.4%
Girl 99.4%
Person 99.4%
Boy 98.9%
Male 98.9%
Shoe 85.8%
Baby 85.6%

Categories

Imagga

paintings art 98%

Text analysis

Amazon

RUP