Human Generated Data

Title

Untitled (two photographs: man seated with framed paintings inside house; man standing with framed paintings inside house)

Date

c. 1950, printed later

People

Artist: Jack Rodden Studio, American, 1914-2016

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.13829

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Human 98.3
Person 98.3
Furniture 94.8
Bus 90.7
Transportation 90.7
Vehicle 90.7
Advertisement 72.3
Collage 72.3
Poster 72.3
Table 67.2
Screen 64.9
Electronics 64.9
Display 63.9
Monitor 63.9
Home Decor 63.5
LCD Screen 61.2
Vessel 58.3
Watercraft 58.3
Shelf 57.4
Indoors 57.3
Room 56.8

Clarifai
created on 2019-11-16

monochrome 99
room 98.5
people 96.9
furniture 95.8
indoors 95.5
no person 95.2
street 92.9
group 89
vehicle 87.4
adult 86
family 85.6
two 84.5
man 82.9
train 82.5
chair 81.4
one 80.3
inside 79.9
window 78.6
industry 78.4
group together 78.3

Imagga
created on 2019-11-16

pump 42.1
gas pump 41.8
pay-phone 38
telephone 33
electronic equipment 27.4
equipment 26.9
mechanical device 24.6
wall 21.4
device 19.5
house 19.2
old 18.1
mechanism 18
refrigerator 17.7
locker 16.1
architecture 15.6
window 15.1
metal 14.5
interior 14.1
building 14.1
white goods 14
fastener 13.9
door 13.3
case 13
home 12.8
home appliance 12.6
city 12.5
steel 12.4
technology 11.9
car 11.4
empty 11.2
safety 11
light 10.7
travel 10.6
windows 10.6
sign 10.5
restraint 10.4
industry 10.2
box 10.1
computer 9.8
vehicle 9.7
station 9.7
machine 9.7
gas 9.6
black 9.6
glass 9.3
appliance 9.3
modern 9.1
transportation 9
urban 8.7
storage 8.6
brick 8.5
power 8.4
room 8.3
street 8.3
security 8.3
digital 8.1
gasoline 7.8
display 7.8
fuel 7.7
energy 7.6
safe 7.5
iron 7.5
vintage 7.4
town 7.4
service 7.4
business 7.3
detail 7.2
open 7.2
information 7.1
work 7.1
sky 7

Microsoft
created on 2019-11-16

text 99.7
indoor 96.8
vehicle 86.3
black and white 81.1
land vehicle 79.5
bus 71.8
white 60.3
kitchen appliance 14.1

Face Analysis

Amazon

AWS Rekognition

Age 43-61
Gender Male, 54.2%
Confused 45%
Surprised 45.1%
Sad 45.1%
Calm 54.6%
Disgusted 45%
Happy 45%
Fear 45%
Angry 45.1%

AWS Rekognition

Age 51-69
Gender Female, 50.2%
Calm 49.5%
Sad 49.5%
Fear 49.5%
Confused 49.5%
Angry 49.5%
Happy 50.5%
Disgusted 49.5%
Surprised 49.5%

Feature Analysis

Amazon

Person 98.3%
Bus 90.7%

Categories

Imagga

interior objects 99.7%

Text Analysis

Amazon

New
New Mexico
Mexico

Google

New Mexice
New
Mexice