Human Generated Data

Title

Untitled (pastor standing on outdoor podium)

Date

c. 1940

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4372

Machine Generated Data

Tags

Each tag below is paired with the service's confidence score, expressed as a percentage.

Amazon
created on 2019-06-01

Clothing 99.7
Apparel 99.7
Person 97.3
Human 97.3
Person 95.5
Person 93.9
Food 93.3
Icing 93.3
Creme 93.3
Dessert 93.3
Cake 93.3
Cream 93.3
Person 93.1
Furniture 79.9
Hat 77.5
Dress 76.6
Face 74.9
Female 72.7
People 71
Chair 67.8
Person 60.7
Portrait 58
Photo 58
Photography 58
Outdoors 56.6
Girl 56.4
Coat 55.8
Overcoat 55.8
Suit 55.8
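
These label/confidence pairs match the shape of results from the AWS Rekognition DetectLabels API (Clarifai and Imagga expose analogous image-tagging endpoints). As a minimal sketch only, assuming the image lives in S3 under a hypothetical bucket and key, a boto3 call producing this kind of list might look like:

import boto3

# Hypothetical image location; the actual source of this record's image is not documented here.
rekognition = boto3.client("rekognition", region_name="us-east-1")
response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "4.2002.4372.jpg"}},
    MinConfidence=55,  # the lowest confidence shown above is 55.8
)
for label in response["Labels"]:
    # Each label carries a name and a confidence percentage, as listed above.
    print(f"{label['Name']} {label['Confidence']:.1f}")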

Clarifai
created on 2019-06-01

people 99.7
adult 96.4
man 95.8
wear 91.6
woman 91.5
group 91.2
veil 88.2
two 87.7
one 85.3
group together 82.4
child 82.2
outfit 80.8
furniture 79.7
many 79.2
wedding 77.3
ceremony 77.2
room 74.5
leader 73.3
sit 73.1
portrait 70.1

Imagga
created on 2019-06-01

negative 78.6
film 63.8
photographic paper 47.8
photographic equipment 31.9
architecture 26.5
history 23.2
fountain 22.9
statue 20.7
building 19.9
travel 19
tourism 17.3
sculpture 17.2
city 16.6
religion 16.1
snow 15.7
marble 15.5
old 15.3
temple 15.1
ancient 14.7
landmark 14.4
art 13.7
historical 13.2
structure 13
column 12.3
culture 12
people 11.7
god 11.5
water 11.3
monument 11.2
winter 11.1
park 10.7
landscape 10.4
sky 10.2
church 10.2
house 10
tourist 10
book jacket 9.9
famous 9.3
historic 9.2
summer 9
stone 8.5
tree 8.5
jacket 7.7
pool 7.7
religious 7.5
symbol 7.4
vacation 7.4
wedding 7.4
peace 7.3
season 7

Google
created on 2019-06-01

No tags recorded.

Microsoft
created on 2019-06-01

black and white 85.1
person 79.2
clothing 76.6
old 65.7
statue 57.9

Face analysis

Each AWS Rekognition entry below describes one detected face: an estimated age range, a gender estimate with its confidence, and per-emotion confidence scores (%).

Amazon

AWS Rekognition

Age 17-27
Gender Male, 51.2%
Sad 45.8%
Angry 45.5%
Disgusted 45.9%
Surprised 45.6%
Happy 45.5%
Calm 51.2%
Confused 45.5%

AWS Rekognition

Age 20-38
Gender Male, 50.9%
Happy 46%
Calm 50.9%
Angry 45.4%
Surprised 45.4%
Disgusted 45.9%
Confused 45.2%
Sad 46.2%

AWS Rekognition

Age 35-52
Gender Female, 52.8%
Surprised 45.3%
Sad 48.7%
Angry 45.4%
Disgusted 45.1%
Calm 46.2%
Happy 48.8%
Confused 45.4%

AWS Rekognition

Age 35-53
Gender Female, 54.5%
Sad 46.1%
Angry 46%
Disgusted 45.5%
Surprised 45.9%
Happy 46%
Calm 49.5%
Confused 45.9%

AWS Rekognition

Age 35-52
Gender Female, 52.3%
Disgusted 46.8%
Sad 46.6%
Happy 45.3%
Surprised 45.7%
Angry 45.6%
Calm 49.3%
Confused 45.7%

AWS Rekognition

Age 23-38
Gender Female, 54%
Confused 45.3%
Disgusted 45.5%
Calm 46.3%
Angry 45.4%
Sad 51.8%
Happy 45.3%
Surprised 45.3%
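
The age, gender, and emotion figures above follow the shape of the AWS Rekognition DetectFaces API with all facial attributes requested. A minimal sketch, assuming the same hypothetical S3 location as before:

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")
response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "4.2002.4372.jpg"}},
    Attributes=["ALL"],  # "ALL" adds age range, gender, and emotions to the default attributes
)
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        # Emotion types arrive uppercase (e.g. "CALM"); confidences are percentages.
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")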

Feature analysis

Amazon

Person 97.3%

Categories

Imagga

paintings art 98.3%

Captions

Microsoft
created on 2019-06-01

a vintage photo of a person 85.2%
an old photo of a person 83.4%
a vintage photo of a person 78.3%
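
Captions like these come from an image-description operation such as Microsoft Azure Computer Vision's "describe", which returns ranked caption candidates with confidences on a 0-1 scale. A hedged sketch using the Python SDK (the endpoint, key, and image URL below are placeholders, not values from this record):

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Placeholder credentials and image URL.
client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("<your-key>"),
)
analysis = client.describe_image("https://example.org/image.jpg", max_candidates=3)
for caption in analysis.captions:
    # Confidence is reported on [0, 1]; multiply by 100 to match the percentages above.
    print(f"{caption.text} {caption.confidence * 100:.1f}%")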