Human Generated Data

Title

Untitled (clergyman speaking on stadium stage with two others seated behind him at Catholic event)

Date

1955-1960

People

Artist: Claseman Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.11015
Machine Generated Data

Tags

Amazon
created on 2019-03-25

Person 99.5
Human 99.5
Person 98.4
Person 94
Apparel 92.8
Clothing 92.8
People 67.4
Photo 62
Face 62
Portrait 62
Photography 62
Furniture 61.2
Stage 59

Clarifai
created on 2019-03-25

people 99.3
monochrome 98.2
man 95.3
light 92.7
adult 92.2
wedding 90.8
water 89.3
one 88.9
woman 88.7
sea 88.5
no person 87.8
ocean 87
winter 83.6
architecture 81.9
indoors 80.8
group 79.7
watercraft 79.1
chair 77.4
dark 77.4
illustration 77.3

Imagga
created on 2019-03-25

sky 19.9
architecture 19.9
tower 18.3
chandelier 17.4
equipment 16
building 15.9
ferris wheel 14.5
ride 14
construction 13.7
structure 13.5
dishwasher 13.4
lighting fixture 13.1
high 13
furniture 12.2
device 11.4
urban 11.4
industry 11.1
house 10.9
city 10.8
wall 10.5
old 10.4
technology 10.4
white goods 10.1
park 10
water 10
fixture 10
antenna 9.9
steel 9.7
metal 9.6
rotating mechanism 9.6
outdoor 9.2
travel 9.1
light 8.7
sea 8.6
furnishing 8.6
design 8.4
shopping 8.3
industrial 8.2
landmark 8.1
basket 8
home 8
black 7.8
summer 7.7
home appliance 7.6
wheel 7.5
lights 7.4
digital 7.3
business 7.3
mechanism 7.2
mechanical device 7.2
modern 7

Google
created on 2019-03-25

Microsoft
created on 2019-03-25

black 78.5
boat 78.5
black and white 60
lighthouse 48.8
monochrome 38.4
ship 35.5

Face analysis

Amazon

AWS Rekognition

Age 20-38
Gender Male, 50.3%
Surprised 49.6%
Sad 49.8%
Disgusted 49.6%
Happy 49.6%
Confused 49.5%
Calm 49.8%
Angry 49.6%

AWS Rekognition

Age 26-43
Gender Female, 50.1%
Happy 49.6%
Confused 49.6%
Disgusted 49.6%
Angry 49.7%
Calm 49.8%
Surprised 49.6%
Sad 49.7%

AWS Rekognition

Age 35-52
Gender Female, 50.4%
Disgusted 49.6%
Surprised 49.5%
Sad 50.1%
Confused 49.5%
Angry 49.6%
Happy 49.5%
Calm 49.7%

Feature analysis

Amazon

Person 99.5%

Captions

Microsoft

a black and white photo of a person 60.7%
a black and white photo of a person 59.4%
black and white photo of a person 53.6%

Text analysis

Amazon

SAFETY
KODAK
KODAK SAFETY FILM
FILM
10

Google

FILM
KODAK
KODAK SAFETY FILM 10
SAFETY
10