Human Generated Data

Title

Untitled (crowd looking at water pumping machinery)

Date

c. 1950, printed later

People

Artist: Jack Rodden Studio, American, 1914-2016

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.13813

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Indoors 99.1
Interior Design 99.1
Human 98
Person 98
Person 97.6
Person 96.3
Person 93.2
Lighting 92.8
Person 88.1
Person 87.1
Person 86.1
Person 82.8
Person 78.4
Person 75.1
Building 74.8
Metropolis 74.8
Town 74.8
Urban 74.8
City 74.8
Person 73.9
Leisure Activities 65.7
Machine 65.4
Room 62.5
Motor 55.8

Clarifai
created on 2019-11-16

silhouette 96.7
people 96.2
illustration 95.3
square 95.2
vector 94.9
man 92.2
art 91.5
dark 91.3
shadow 90.9
design 90.5
image 89.1
desktop 88.2
no person 87.5
woman 84.8
graphic 82.8
old 78.7
picture frame 78.6
portrait 78.5
abstract 78.2
architecture 78

Imagga
created on 2019-11-16

equipment 23
chandelier 22.2
black 20.6
silhouette 17.4
electronic equipment 16.6
monitor 16.5
lighting fixture 15.5
business 15.2
light 14
sax 13.7
design 13.5
fixture 12.8
sky 12.7
old 12.5
window 12.5
night 12.4
symbol 12.1
grunge 11.9
art 11.8
city 11.6
dark 10.8
man 10.7
reflection 10.6
modern 10.5
skyline 10.4
lamp 10.4
shape 10.4
architecture 10.3
glass 10.2
backboard 10.1
device 9.7
technology 9.6
building 9.6
pattern 9.6
space 9.3
tower 9
sign 9
sunset 9
structure 8.8
metal 8
person 7.9
urban 7.9
bright 7.9
sea 7.8
male 7.8
port 7.7
downtown 7.7
retro 7.4
street 7.4
television 7.3
people 7.2
sun 7.2
decoration 7.2
dirty 7.2
antique 7.1
interior 7.1
drawing 7.1

Face analysis

Amazon

AWS Rekognition

Age 12-22
Gender Female, 50.2%
Confused 49.5%
Angry 49.5%
Surprised 49.5%
Calm 49.6%
Disgusted 49.6%
Fear 49.5%
Sad 50.2%
Happy 49.6%

AWS Rekognition

Age 36-52
Gender Male, 50.3%
Calm 49.5%
Confused 49.5%
Fear 49.5%
Sad 49.5%
Angry 50.5%
Surprised 49.5%
Disgusted 49.5%
Happy 49.5%

Feature analysis

Amazon

Person 98%

Captions

Microsoft
created on 2019-11-16

a flat screen television 62.6%
a flat screen tv 62.5%
a flat screen television on the wall 56.4%