Human Generated Data

Title

Untitled (cast of Pilgrim and Indian performers on platform of altar of church)

Date

1939

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4140

Machine Generated Data

Tags

Amazon
created on 2019-06-01

Interior Design 98.5
Indoors 98.5
Person 98.4
Human 98.4
Person 95.4
Person 95.4
Person 94.5
Person 93.6
Person 92.2
People 90.6
Person 88.6
Person 85.5
Person 83.8
Building 79.3
Architecture 78.5
Crowd 73.3
Room 73.2
Church 59.2
Stage 57.9
Furniture 57.6
Altar 56.2
Audience 55.3

Clarifai
created on 2019-06-01

people 99.3
adult 97.1
furniture 96.8
man 95.8
group 95.8
indoors 95.5
room 95.4
illustration 95
woman 93.4
table 91.7
administration 91.5
chair 89.8
art 87.6
decoration 85.1
inside 84.2
family 84.1
leader 84
home 83.7
print 83.6
ceremony 82.3

Imagga
created on 2019-06-01

altar 73
structure 57.1
architecture 35.1
glass 32.1
building 26.2
religion 23.3
landmark 22.6
church 21.3
old 20.9
history 20.6
city 19.9
art 17.8
monument 17.7
god 17.2
fountain 16.9
famous 16.7
cathedral 16.3
marble 16.2
house 15.9
sculpture 15.8
ancient 15.6
window 15.6
sketch 15.5
culture 15.4
historic 14.7
drawing 13.9
travel 13.4
historical 13.2
column 12.8
baroque 12.7
water 12.7
tourism 12.4
decoration 12.3
stone 11.8
balcony 11.5
interior 11.5
religious 11.2
tourist 10.9
statue 10.6
exterior 10.1
holy 9.6
faith 9.6
people 9.5
symbol 9.4
place 9.3
ornate 9.1
representation 9.1
facade 9.1
gold 9
home 8.8
arch 8.7
light 8.7
table 8.7
attraction 8.6
case 8.5
design 8.4
traditional 8.3
urban 7.9
flowers 7.8
education 7.8
scene 7.8
catholic 7.8
dome 7.7
chair 7.6
inside 7.4
antique 7.3
indoors 7

Google
created on 2019-06-01

Photograph 96.6
White 96.3
Room 74.4
Child 57.3
Black-and-white 56.4
Family 53.7
Tableware 50.4

Microsoft
created on 2019-06-01

white 65.8

Face analysis

Amazon

AWS Rekognition

Age 26-43
Gender Female, 52.7%
Angry 45.3%
Surprised 45.5%
Disgusted 45.5%
Sad 45.8%
Calm 50.8%
Happy 45.4%
Confused 46.7%

AWS Rekognition

Age 20-38
Gender Female, 50%
Surprised 49.6%
Calm 49.9%
Disgusted 49.5%
Confused 49.6%
Sad 49.9%
Happy 49.5%
Angry 49.5%

AWS Rekognition

Age 26-43
Gender Female, 53.2%
Happy 45.3%
Calm 45.9%
Angry 45.4%
Confused 45.2%
Surprised 45.2%
Sad 52.6%
Disgusted 45.4%

AWS Rekognition

Age 17-27
Gender Male, 50.1%
Confused 49.5%
Surprised 49.6%
Calm 50.1%
Sad 49.5%
Happy 49.6%
Disgusted 49.6%
Angry 49.6%

AWS Rekognition

Age 38-59
Gender Female, 50.4%
Confused 49.5%
Surprised 49.5%
Sad 49.8%
Angry 49.6%
Happy 49.7%
Calm 49.6%
Disgusted 49.8%

AWS Rekognition

Age 38-59
Gender Male, 50.3%
Confused 49.7%
Disgusted 49.5%
Happy 49.7%
Surprised 49.6%
Calm 49.8%
Sad 49.6%
Angry 49.6%

AWS Rekognition

Age 35-52
Gender Female, 52.3%
Angry 45.6%
Confused 45.3%
Surprised 45.4%
Happy 46.4%
Sad 47.2%
Calm 49.6%
Disgusted 45.4%

AWS Rekognition

Age 11-18
Gender Female, 50.4%
Happy 49.7%
Confused 49.5%
Angry 49.6%
Sad 49.7%
Calm 49.9%
Surprised 49.5%
Disgusted 49.5%

Feature analysis

Amazon

Person 98.4%

Categories

Imagga

interior objects 99.9%