Human Generated Data

Title

Untitled (outdoor religious celebration)

Date

c. 1939

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5700

Machine Generated Data

Tags

Amazon
created on 2019-06-01

Person 99.5
Human 99.5
Person 99.2
Person 98.5
Nature 95.9
Person 94.1
Outdoors 91.3
Clothing 83.1
Apparel 83.1
Snow 74.2
Person 69.5
Weather 68.6
Face 67.9
Pedestrian 65.9
People 63.4
Rural 59.5
Countryside 59.5
Shelter 59.5
Building 59.5
Photo 58.9
Photography 58.9
Winter 58.2
Art 57.7
Coat 56.9
Overcoat 56.9
Ice 55.3
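
The Amazon labels above are the kind of output returned by AWS Rekognition's label-detection call. A minimal sketch follows, assuming boto3 is installed with valid AWS credentials and that the photograph is available locally as photo.jpg (both are assumptions, not part of the record).

```python
import boto3

# Assumption: AWS credentials and region are already configured for boto3.
rekognition = boto3.client("rekognition", region_name="us-east-1")

# Assumption: the scanned photograph is stored locally as photo.jpg.
with open("photo.jpg", "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=30,
        MinConfidence=50,
    )

for label in response["Labels"]:
    # Each label carries a name and a 0-100 confidence score,
    # matching entries such as "Person 99.5" above.
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```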

Clarifai
created on 2019-06-01

people 99.7
adult 98.9
man 97.4
woman 96.3
group 94.5
group together 93.2
two 88.3
street 86.1
monochrome 85.3
wear 85.2
child 83.9
family 83.4
winter 76.8
one 76
home 72.9
vehicle 71.6
snow 71.3
many 70.5
several 70.5
veil 69.2
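
The Clarifai concepts above could be reproduced with Clarifai's v2 predict REST endpoint. The sketch below is an assumption-heavy illustration: the model ID, API key, and image URL are placeholders, and the endpoint shape reflects Clarifai's public v2 API rather than anything stated in the record.

```python
import requests

CLARIFAI_API_KEY = "YOUR_API_KEY"        # assumption: a valid Clarifai key
MODEL_ID = "general-image-recognition"   # assumption: the general model alias

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {CLARIFAI_API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": "https://example.com/photo.jpg"}}}]},
)
response.raise_for_status()

for concept in response.json()["outputs"][0]["data"]["concepts"]:
    # Concepts carry a 0-1 confidence; scaling by 100 gives the
    # percentage-style values shown above (e.g. "people 99.7").
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')
```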

Imagga
created on 2019-06-01

negative 40.2
film 34.1
snow 23
person 23
photographic paper 22.7
people 20.1
work 18.8
business 18.8
adult 17.2
winter 17
businessman 16.8
man 16.5
male 15.4
photographic equipment 15.1
job 15
building 14.8
grunge 14.5
manager 13
cold 12.9
color 12.8
professional 12.6
vintage 12.4
office 12
looking 12
casual 11.9
computer 11.2
men 11.2
old 11.1
architecture 10.9
worker 10.8
weather 10.6
black 10.2
drawing 9.8
group 9.7
indoors 9.7
frost 9.6
grungy 9.5
space 9.3
ice 9.2
tree 9.2
clothing 9
design 9
cheerful 8.9
forest 8.7
sitting 8.6
construction 8.6
businesspeople 8.5
portrait 8.4
house 8.4
human 8.2
park 8.2
indoor 8.2
one 8.2
retro 8.2
landscape 8.2
technology 8.2
paint 8.1
child 8.1
success 8
trees 8
smile 7.8
face 7.8
season 7.8
antique 7.8
scene 7.8
snowy 7.8
engineer 7.7
outdoor 7.6
texture 7.6
room 7.5
happy 7.5
frame 7.5
city 7.5
occupation 7.3
20s 7.3
successful 7.3
rough 7.3
dirty 7.2
suit 7.2
history 7.2
handsome 7.1
women 7.1
medical 7.1
happiness 7
scenic 7
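
The Imagga tags above correspond to Imagga's v2 tagging endpoint. A minimal sketch, assuming an Imagga API key/secret pair and a publicly reachable image URL (all placeholders):

```python
import requests

IMAGGA_KEY = "YOUR_KEY"        # assumption: Imagga API key
IMAGGA_SECRET = "YOUR_SECRET"  # assumption: Imagga API secret

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/photo.jpg"},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
)
response.raise_for_status()

for tag in response.json()["result"]["tags"]:
    # Each entry pairs a 0-100 confidence with a language-keyed tag name,
    # mirroring entries such as "negative 40.2" above.
    print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')
```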

Google
created on 2019-06-01

Microsoft
created on 2019-06-01

person 97.8
man 95.6
house 91.1
outdoor 88.3
black and white 74.1
posing 71.9
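
The Microsoft tags above resemble the output of Azure Computer Vision's analyze operation. A minimal sketch, assuming an Azure resource endpoint and subscription key (both placeholders):

```python
import requests

ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"  # assumption
KEY = "YOUR_KEY"                                                 # assumption

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": "https://example.com/photo.jpg"},
)
response.raise_for_status()

for tag in response.json()["tags"]:
    # Tags come back with a 0-1 confidence; scale to match the
    # percentage-style values above (e.g. "person 97.8").
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')
```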

Face analysis

Amazon

AWS Rekognition

Age 19-36
Gender Female, 52.4%
Disgusted 45%
Sad 46.3%
Happy 45.1%
Surprised 45.4%
Angry 45.2%
Calm 52.5%
Confused 45.6%

AWS Rekognition

Age 26-43
Gender Female, 50.2%
Surprised 49.6%
Confused 49.6%
Disgusted 49.7%
Happy 49.6%
Sad 49.9%
Calm 49.5%
Angry 49.7%

AWS Rekognition

Age 17-27
Gender Male, 50.1%
Sad 49.5%
Angry 49.5%
Happy 49.5%
Confused 49.5%
Calm 49.5%
Surprised 49.5%
Disgusted 50.5%

AWS Rekognition

Age 14-25
Gender Female, 50.4%
Confused 49.5%
Disgusted 49.6%
Calm 50.1%
Angry 49.6%
Sad 49.6%
Happy 49.5%
Surprised 49.6%
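
The age, gender, and emotion estimates above have the shape of output produced by AWS Rekognition's face-attribute detection. A minimal sketch, again assuming boto3 credentials and a local photo.jpg:

```python
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # include age range, gender, and emotions
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        # One confidence per emotion type, as in the per-face lists above.
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')
```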

Feature analysis

Amazon

Person 99.5%

Categories

Imagga

paintings art 97.1%
interior objects 2.7%