Human Generated Data

Title

Untitled (woman in coat holding purse, clock on wall behind her)

Date

c. 1950, printed later

People

Artist: Jack Rodden Studio, American, 1914-2016

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.13831

Machine Generated Data

Tags (confidence scores in %)

Amazon
created on 2019-11-16

Advertisement 99.2
Collage 99.2
Poster 99.2
Human 95.4
Person 95.4
Transportation 93.5
Car 93.5
Vehicle 93.5
Automobile 93.5
Person 79.6
Watercraft 73.8
Vessel 73.8
Apparel 72.3
Clothing 72.3
Person 59.9
Table 58.9
Furniture 58.9
Person 56.1
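
These labels match the output shape of Amazon Rekognition's DetectLabels API. A minimal sketch of how such a tag list can be generated with boto3; the file name and MinConfidence threshold are illustrative assumptions:

```python
# Minimal sketch: produce Rekognition-style labels for a local image.
# Assumes AWS credentials are configured; the file name is hypothetical.
import boto3

client = boto3.client("rekognition")

with open("untitled_woman_in_coat.jpg", "rb") as f:  # hypothetical file
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=20,
        MinConfidence=50,  # assumed threshold; tune as needed
    )

# Each label carries a name and a 0-100 confidence, as listed above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```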

Clarifai
created on 2019-11-16

people 99.6
monochrome 99.1
street 98.7
adult 97.5
man 97.5
one 97.4
vehicle 95.5
two 94.4
group 91.4
city 90.5
woman 90.1
outdoors 89.3
no person 88.3
portrait 87.3
transportation system 86.4
action 86
group together 85.5
music 84.7
war 84.3
silhouette 82.6
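
Clarifai's general model returns concept names with confidences on a 0-1 scale (shown above as percentages). A hedged sketch against the Clarifai v2 REST API; the API key and image URL are placeholders, and the model ID below is the public general model's ID from this era and may have changed:

```python
# Hedged sketch: query Clarifai's general model over its v2 REST API.
import requests

MODEL_ID = "aaa03c23b3724a16a56b629203edc62c"  # assumed general-model ID
resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": "Key YOUR_API_KEY"},  # placeholder key
    json={"inputs": [{"data": {"image": {"url": "https://example.com/photo.jpg"}}}]},
)
resp.raise_for_status()

# Concepts carry a 0-1 "value"; scale to match the percentages above.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')
```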

Imagga
created on 2019-11-16

monitor 45.4
electronic equipment 34.1
billboard 33.4
signboard 27.1
sky 24.2
equipment 23.3
structure 22.1
water 22
landscape 19.3
clouds 17.7
sea 17.2
travel 16.9
sunset 15.3
boat 14.9
lake 14.6
cloud 14.6
beach 14.3
ocean 14.1
light 12.7
coast 12.6
night 12.4
vacation 12.3
outdoor 12.2
scene 12.1
sun 12.1
river 11.6
business 11.5
dusk 11.4
scenic 11.4
reflection 11.4
sunrise 11.2
device 11.2
peaceful 11
calm 11
city 10.8
horizon 10.8
tourism 10.7
home appliance 10.7
coastline 10.3
furniture 10.3
summer 10.3
evening 10.3
relaxation 10
desk 10
silhouette 9.9
outdoors 9.7
people 9.5
white goods 9.5
bay 9.4
architecture 9.4
holiday 9.3
tranquil 9
table 9
black 9
transportation 9
urban 8.7
man 8.7
port 8.7
peace 8.2
computer 8.2
scenery 8.1
dishwasher 8
sand 7.9
office 7.8
harbor 7.7
shore 7.4
natural 7.4
appliance 7.2
person 7.2
mountain 7.1
working 7.1
work 7.1
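
Imagga returns the same tag/confidence pairs over a simple REST call. A minimal sketch using its v2 tagging endpoint with HTTP basic auth; the key, secret, and image URL are placeholders:

```python
# Hedged sketch: fetch Imagga tags for an image URL over the v2 REST API.
import requests

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/photo.jpg"},  # placeholder URL
    auth=("YOUR_API_KEY", "YOUR_API_SECRET"),  # HTTP basic auth placeholders
)
resp.raise_for_status()

# Each result pairs an English tag with a 0-100 confidence, as listed above.
for item in resp.json()["result"]["tags"]:
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')
```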

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

text 96.3
ship 91.1
black and white 89.5
vehicle 88.5
watercraft 50.6
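
These tags, together with the ranked captions under Captions below, match the Tags and Description features of Azure's Computer Vision Analyze endpoint. A hedged sketch against the v2.0 API that was current when this record was created; the region, subscription key, and image URL are placeholders:

```python
# Hedged sketch: request tags and ranked captions from Azure Computer Vision.
import requests

ENDPOINT = "https://westus.api.cognitive.microsoft.com"  # assumed region
resp = requests.post(
    f"{ENDPOINT}/vision/v2.0/analyze",
    params={"visualFeatures": "Description,Tags"},
    headers={"Ocp-Apim-Subscription-Key": "YOUR_KEY"},  # placeholder key
    json={"url": "https://example.com/photo.jpg"},
)
resp.raise_for_status()
data = resp.json()

# Confidences are 0-1 floats; scale to match the percentages in this record.
for tag in data["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')
for caption in data["description"]["captions"]:
    print(f'{caption["text"]} {caption["confidence"] * 100:.1f}%')
```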

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 12-22
Gender Female, 54.9%
Fear 45%
Happy 54.8%
Calm 45%
Sad 45%
Angry 45%
Surprised 45%
Disgusted 45%
Confused 45%

AWS Rekognition

Age 23-35
Gender Male, 50.2%
Angry 49.5%
Disgusted 49.5%
Happy 49.6%
Calm 50.3%
Sad 49.6%
Surprised 49.5%
Fear 49.5%
Confused 49.5%
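
Each block above corresponds to one face returned by Amazon Rekognition's DetectFaces API, which estimates an age range, a gender, and a per-emotion confidence. A minimal sketch with boto3; the file name is an illustrative assumption:

```python
# Minimal sketch: per-face age, gender, and emotion estimates with
# Rekognition DetectFaces. Assumes AWS credentials are configured.
import boto3

client = boto3.client("rekognition")

with open("untitled_woman_in_coat.jpg", "rb") as f:  # hypothetical file
    response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

# One FaceDetails entry per detected face, mirroring the blocks above.
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```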

Feature analysis

Amazon

Person 95.4%
Car 93.5%

Categories

Captions

Microsoft
created on 2019-11-16

a black and white photo of a kitchen 34.5%
an old photo of a person 31.2%

Text analysis

Google

MILITARY INSTITUTE NEW MEY
MILITARY
INSTITUTE
NEW
MEY
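
This layout is characteristic of Google Cloud Vision OCR: the first text annotation is the full detected string, and the remaining annotations are the individual words. A minimal sketch; credentials and the file name are placeholders, and the Image constructor shown assumes google-cloud-vision 2.x:

```python
# Hedged sketch: OCR an image with Google Cloud Vision text detection.
# The first annotation is the full string; the rest are individual words.
from google.cloud import vision

client = vision.ImageAnnotatorClient()  # assumes configured credentials

with open("untitled_woman_in_coat.jpg", "rb") as f:  # hypothetical file
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)
for annotation in response.text_annotations:
    print(annotation.description)
```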