Human Generated Data

Title

Untitled (portrait of a family standing on grass and dirt in front of fence, baby on rocking chair)

Date

c. 1935, printed later

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6088

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Person 99.7
Human 99.7
Person 99.6
Person 99.5
Person 99.3
Person 99.3
Person 99.3
Person 99.2
Person 99.1
Person 98.3
Person 97.9
Person 97
Person 94.6
Person 93.5
Person 82.6
Person 81.5
Person 81
People 78.8
Person 75.9
Apparel 74.7
Clothing 74.7
Person 71
Advertisement 68.2
Overcoat 65.5
Suit 65.5
Coat 65.5
Face 64
Poster 63.2
Text 57.9
Furniture 56.6
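Each machine-generated tag above pairs a label with a confidence score (a percentage). A minimal sketch of parsing this flattened "label score" format and keeping only high-confidence tags; the sample lines are copied from the Amazon list above, and the 90% threshold is an illustrative assumption:

```python
# Parse flattened "label confidence" lines into (label, score) pairs
# and keep only high-confidence tags. Sample data from the list above;
# the 90.0 cutoff is an arbitrary illustrative threshold.
raw_tags = """Person 99.7
Human 99.7
People 78.8
Apparel 74.7
Furniture 56.6"""

def parse_tags(text):
    tags = []
    for line in text.splitlines():
        # Split on the last space: labels may themselves contain spaces
        # (e.g. "group together" in the Clarifai list).
        label, score = line.rsplit(" ", 1)
        tags.append((label, float(score)))
    return tags

high_confidence = [(l, s) for l, s in parse_tags(raw_tags) if s >= 90.0]
print(high_confidence)  # → [('Person', 99.7), ('Human', 99.7)]
```

The same parser applies to the Clarifai, Imagga, and Microsoft lists below, which use the identical label-then-score layout.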

Clarifai
created on 2019-11-16

people 100
group 99.8
adult 99.3
group together 99.1
man 98.1
many 97.7
child 97.6
woman 96.2
several 95.2
wear 94.9
administration 94.8
military 94.4
war 94.4
vehicle 92.2
recreation 91.2
soldier 91
leader 90.8
outfit 89.9
four 88.8
boy 87.3

Imagga
created on 2019-11-16

barbershop 24.4
shop 22
window 20.2
snow 19.5
old 19.5
building 19
mercantile establishment 16.7
dirty 16.3
grunge 16.2
wall 15.4
black 13.8
architecture 13.3
urban 13.1
man 12.8
city 12.5
structure 11.8
paint 11.8
people 11.7
house 11.7
vintage 11.6
travel 11.3
musical instrument 11.2
place of business 11.1
street 11
pattern 10.9
door 10.9
texture 10.4
design 10.1
weather 10
light 10
history 9.8
art 9.8
outdoors 9.7
winter 9.4
glass 9.3
male 9.3
silhouette 9.1
telephone 8.8
antique 8.6
scene 8.6
call 8.5
poster 8.5
person 8.5
office 8.4
sky 8.3
equipment 8.1
business 7.9
decoration 7.8
cold 7.7
weathered 7.6
fence 7.5
dark 7.5
frame 7.5
landscape 7.4
wind instrument 7.4
tourist 7.4
exterior 7.4
retro 7.4
water 7.3
alone 7.3
home 7.2
working 7.1

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

clothing 98.6
person 98.2
text 97
man 94.7
gallery 92.7
room 76.2
posing 75.5
photograph 72.8
group 63.9
old 54.4

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 44-62
Gender Male, 54.5%
Angry 45.1%
Sad 46%
Confused 45.4%
Calm 52.9%
Surprised 45%
Happy 45.1%
Disgusted 45.4%
Fear 45%
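The face analysis reports a score for each candidate emotion; the highest-scoring entry is conventionally read as the predicted emotion. A minimal sketch, using the values reported above, of selecting that dominant emotion:

```python
# Emotion scores from the AWS Rekognition face analysis above.
emotions = {
    "Angry": 45.1, "Sad": 46.0, "Confused": 45.4, "Calm": 52.9,
    "Surprised": 45.0, "Happy": 45.1, "Disgusted": 45.4, "Fear": 45.0,
}

# The dominant emotion is simply the highest-scoring entry.
dominant = max(emotions, key=emotions.get)
print(dominant)  # → Calm
```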

Feature analysis

Amazon

Person
Person 99.7%

Categories