Human Generated Data

Title

Untitled (two dogs pulling a bear in a cart)

Date

c. 1945

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.12199

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags (confidence scores, 0-100)

Amazon
created on 2019-11-16

Human 91.3
Person 91.3
Canine 84.6
Pet 84.6
Animal 84.6
Mammal 84.6
Dog 84.6
Transportation 81.1
Vehicle 81.1
Horse 79.1
Person 74.4
Text 63.2
Art 62.6
People 59.6
Clothing 57.2
Apparel 57.2
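
The label/score pairs above follow the shape of Amazon Rekognition's DetectLabels response: each detected label carries a name and a 0-100 confidence. A minimal sketch of how tags like these could be generated with boto3, assuming the photograph is available as a local JPEG (the filename and the MinConfidence threshold are illustrative, not from the source):

```python
# Minimal sketch: Rekognition-style labels for a local image (boto3).
# Assumes AWS credentials are already configured in the environment.
import boto3

client = boto3.client("rekognition")

# Hypothetical filename; the actual source image is not part of this record.
with open("steinmetz_bear_cart.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=50,  # drop labels scored below 50% confidence
    )

# Prints name/confidence pairs in the same form as the list above,
# e.g. "Person 91.3", "Dog 84.6".
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```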

Clarifai
created on 2019-11-16

people 99.2
group together 96.2
group 95.8
man 93.8
fence 92.3
dog 92.0
many 90.3
canine 90.2
adult 88.9
cage 88.7
military 88.7
child 87.3
offense 86.7
print 86.4
buttocks 85.2
war 83.2
monochrome 83.0
wear 82.8
vehicle 82.3
security 80.6

Imagga
created on 2019-11-16

sketch 61.2
drawing 51.4
snow 42.0
representation 31.0
old 20.9
vintage 20.7
grunge 20.4
tricycle 20.4
weather 18.8
wheeled vehicle 18.1
retro 18.0
frame 15.8
vehicle 15.7
black 15.6
design 15.2
antique 13.8
texture 13.2
aged 11.8
paper 11.8
decoration 11.6
pattern 11.6
wall 11.5
art 11.2
dirty 10.8
ancient 10.4
winter 10.2
window 10.1
man 10.1
house 10.0
paint 10.0
newspaper 9.9
structure 9.9
empty 9.4
poster 9.4
blank 9.4
screen 9.3
decorative 9.2
travel 9.1
border 9.0
transportation 9.0
style 8.9
conveyance 8.9
interior 8.8
graphic 8.8
cold 8.6
plan 8.5
card 8.5
fence 8.2
park 8.2
rough 8.2
road 8.1
home 8.0
urban 7.9
day 7.8
dirt 7.6
product 7.6
wood 7.5
city 7.5
sport 7.4
transport 7.3
material 7.1
modern 7.0

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

person 80.6
drawing 79.6
text 78.1
clothing 72.3
old 41.6

Color Analysis

Face Analysis

Amazon

AWS Rekognition

Age 32-48
Gender Male, 52.0%
Happy 45.1%
Disgusted 45.6%
Angry 45.1%
Fear 53.6%
Calm 45.1%
Surprised 45.2%
Sad 45.1%
Confused 45.1%

AWS Rekognition

Age 21-33
Gender Male, 50.4%
Happy 49.5%
Sad 50.4%
Disgusted 49.5%
Surprised 49.5%
Fear 49.5%
Angry 49.5%
Confused 49.5%
Calm 49.6%
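
The two face records above match the shape of Rekognition's DetectFaces output, which reports an estimated age range, a gender guess with its confidence, and a confidence score for each emotion type. A hedged sketch of retrieving the same attributes (the filename is again a placeholder):

```python
# Minimal sketch: face attributes in the style of the records above (boto3).
import boto3

client = boto3.client("rekognition")

with open("steinmetz_bear_cart.jpg", "rb") as f:  # hypothetical filename
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age, gender, and emotion estimates
    )

# One FaceDetails entry per detected face; two were found in this image.
for face in response["FaceDetails"]:
    age = face["AgeRange"]    # e.g. {'Low': 32, 'High': 48}
    gender = face["Gender"]   # e.g. {'Value': 'Male', 'Confidence': 52.0}
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:  # one confidence per emotion type
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```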

Feature Analysis

Amazon

Person 91.3%
Dog 84.6%
Horse 79.1%

Categories

Imagga

paintings art 94.5%
interior objects 3.9%