Human Generated Data

Title

Untitled (two photographs: two nuns standing together in interior)

Date

c. 1935, printed later

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6077

Machine Generated Data

Tags (confidence scores, in percent)

Amazon
created on 2019-11-16

Clothing 85.5
Apparel 85.5
Human 78.6
Person 78.6
Person 77.3
Shop 71.6
Mannequin 64.6
Room 61.1
Indoors 61.1
Window Display 58.4
Furniture 58.2
Person 54.6
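
The Amazon tags above are label-detection output from AWS Rekognition. As a minimal sketch of how such name/confidence pairs can be produced with boto3 (not the exact pipeline used for this record; the file name and thresholds are illustrative assumptions):

# A minimal sketch, assuming boto3 is installed and AWS credentials
# are configured; "photo.jpg" is a placeholder file name.
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

# detect_labels returns labels with confidence scores on a 0-100 scale,
# comparable to the "Clothing 85.5", "Person 78.6" values listed above.
response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,
    MinConfidence=50,
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")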

Clarifai
created on 2019-11-16

people 98.9
room 97.6
wear 96.7
mirror 96.1
adult 95.6
woman 94.3
group 92.4
indoors 91.9
model 91.1
wedding 90
window 89.5
home 89.4
dress 89.2
girl 88.7
man 88.6
veil 88.6
furniture 87.4
portrait 86.2
fashion 85.4
ghost 84.1

Imagga
created on 2019-11-16

furniture 35.9
punching bag 31.9
wardrobe 22.8
bag 22
bottle 20.8
furnishing 20.5
game equipment 19.1
interior 18.5
equipment 17.1
container 15.3
urban 14
modern 13.3
architecture 13.3
glass 13.2
black 13.1
inside 12.9
window 12.3
boutique 12.2
case 12.1
robe 11.9
building 11.9
people 11.7
wood 11.7
business 11.5
fashion 11.3
clothing 10.6
wine bottle 10.6
travel 10.5
wine 10.2
nobody 10.1
light 10
old 9.7
table 9.5
drink 9.2
shopping 9.2
room 8.9
object 8.8
wooden 8.8
women 8.7
life 8.6
garment 8.5
buffet 8.4
design 8.4
alcohol 8.3
city 8.3
group 8
metal 8
home 8
vessel 7.9
luxury 7.7
wall 7.7
chair 7.6
elegance 7.5
cabinet 7.5
floor 7.4
close 7.4
style 7.4
man 7.4
reflection 7.3
indoor 7.3
indoors 7
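
The Imagga tags above come from a REST tagging API rather than an AWS-style SDK. A minimal sketch of the kind of call that yields tag/confidence pairs like these, using Imagga's v2 tags endpoint with the requests library (the credentials and image URL are placeholder assumptions):

# A minimal sketch; api_key, api_secret, and image_url are assumptions.
import requests

api_key = "YOUR_API_KEY"
api_secret = "YOUR_API_SECRET"
image_url = "https://example.com/photo.jpg"

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": image_url},
    auth=(api_key, api_secret),
)

# Each entry pairs a confidence score with a tag,
# e.g. "furniture 35.9" in the list above.
for entry in response.json()["result"]["tags"]:
    print(f"{entry['tag']['en']} {entry['confidence']:.1f}")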

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

black and white 93.5
black 71.8
white 70.1
text 59.4
clothing 57.1
gallery 55.3
room 44.7
furniture 25.6
several 11.2

Face analysis

Amazon

AWS Rekognition

Age 42-60
Gender Male, 52.2%
Happy 45.2%
Angry 46.2%
Disgusted 45.3%
Confused 45.3%
Sad 50.2%
Calm 47%
Surprised 45.2%
Fear 45.5%

AWS Rekognition

Age 36-54
Gender Female, 54.2%
Angry 48.5%
Surprised 45%
Sad 48.5%
Happy 45%
Calm 47%
Fear 45%
Confused 45.9%
Disgusted 45%

AWS Rekognition

Age 44-62
Gender Male, 52.6%
Happy 45.5%
Sad 48.4%
Disgusted 45.1%
Fear 45.2%
Calm 50.1%
Angry 45.3%
Confused 45.2%
Surprised 45.2%

AWS Rekognition

Age 26-40
Gender Male, 50.1%
Sad 49.7%
Calm 49.7%
Angry 49.6%
Fear 49.7%
Happy 49.6%
Surprised 49.6%
Disgusted 49.5%
Confused 49.5%

AWS Rekognition

Age 31-47
Gender Female, 54.2%
Calm 53.5%
Fear 45%
Confused 45.1%
Happy 45%
Angry 46%
Disgusted 45%
Sad 45.4%
Surprised 45%

AWS Rekognition

Age 27-43
Gender Female, 50%
Sad 49.7%
Surprised 49.6%
Angry 49.6%
Calm 49.9%
Happy 49.6%
Fear 49.6%
Disgusted 49.5%
Confused 49.5%
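
Each block above is one detected face, with an estimated age range, a gender guess, and per-emotion confidence scores. A minimal sketch of how such output can be obtained from AWS Rekognition's detect_faces call with boto3 (again, "photo.jpg" is a placeholder, not the museum's actual file):

# A minimal sketch, assuming boto3 with configured AWS credentials.
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

# Attributes=["ALL"] requests age range, gender, and emotion
# estimates like the per-face blocks listed above.
response = client.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")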

Feature analysis

Amazon

Person 78.6%

Categories

Imagga

interior objects 94.8%
paintings art 3.9%