Human Generated Data

Title

Untitled (studio portrait of two nuns standing in front of painted backdrop)

Date

c. 1905-1915, printed later

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6029

Machine Generated Data

Tags (label followed by the model's confidence, on a 0-100 scale)

Amazon
created on 2019-11-16

Human 98.3
Person 97.4
Plant 95.9
Blossom 93.7
Flower Arrangement 93.7
Flower 93.7
Flower Bouquet 93.7
Interior Design 90.7
Indoors 90.7
Room 79.2
Bedroom 68.5
Furniture 63.4
Bed 62.6
Photo 57.6
Photography 57.6
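
The Amazon tags above have the shape of output from AWS Rekognition's DetectLabels operation. A minimal sketch of producing such a list with the boto3 SDK, assuming a local copy of the image; the filename, region, and thresholds are placeholders, not values taken from this record:

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")  # placeholder region

    with open("photo.jpg", "rb") as f:  # placeholder local copy of the image
        image_bytes = f.read()

    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=20,        # placeholder cap on returned labels
        MinConfidence=50.0,  # placeholder confidence floor
    )
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")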

Clarifai
created on 2019-11-16

people 99.8
monochrome 98.6
adult 98
one 97.6
portrait 97.2
man 96.7
television 96.6
woman 95.6
furniture 93.2
actress 92.3
two 92.2
indoors 91.9
window 91.4
room 90
family 88.7
music 87.5
facial expression 87.3
actor 86.7
group 86.4
analogue 85.3
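
The Clarifai concepts above are the kind of result returned by Clarifai's general image-recognition model, which scores concepts on a 0-1 scale. A hedged sketch against the Clarifai v2 REST API; the API key, image URL, and model id are placeholders, and the exact model identifier varies by API version:

    import requests

    CLARIFAI_KEY = "<api-key>"                   # placeholder credential
    IMAGE_URL = "https://example.org/photo.jpg"  # placeholder hosted image

    # "general-image-recognition" is an assumed public model id; older
    # deployments expose the same model under a different identifier.
    resp = requests.post(
        "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
        headers={"Authorization": f"Key {CLARIFAI_KEY}"},
        json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
    )
    for concept in resp.json()["outputs"][0]["data"]["concepts"]:
        print(f"{concept['name']} {concept['value'] * 100:.1f}")  # rescale to 0-100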

Imagga
created on 2019-11-16

television 31.6
adult 22.6
person 21.3
car 19.9
interior 19.4
portrait 18.7
room 18.6
furniture 17.5
home 16.7
telecommunication system 16.5
people 16.2
sitting 15.4
attractive 15.4
bride 15.3
sexy 14.4
happy 14.4
face 14.2
house 13.4
looking 12.8
man 12.8
black 12.7
dress 12.6
indoors 12.3
window 12.2
door 12.1
wedding 11.9
one 11.9
love 11.8
happiness 11.7
device 11.7
vehicle 11.6
male 11.3
electronic equipment 11.3
fashion 11.3
modern 11.2
indoor 10.9
smiling 10.8
lifestyle 10.8
smile 10.7
posing 10.7
couple 10.4
chair 10.1
bedroom 9.6
women 9.5
bouquet 9.4
light 9.3
model 9.3
bookcase 9.2
elegance 9.2
relaxation 9.2
pretty 9.1
cheerful 8.9
lady 8.9
driver 8.7
hair 8.7
blond 8.7
automobile 8.6
cute 8.6
men 8.6
luxury 8.6
old 8.4
transport 8.2
equipment 8.2
gorgeous 8.1
transportation 8.1
broadcasting 7.9
brunette 7.8
skin 7.6
human 7.5
floor 7.4
sensuality 7.3
celebration 7.2
worker 7.1
travel 7
furnishing 7
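
Imagga's tag list, with confidences already on a 0-100 scale, matches the response of its v2 tags endpoint. A minimal sketch using HTTP basic auth; the key, secret, and image URL are placeholders:

    import requests

    IMAGGA_KEY, IMAGGA_SECRET = "<key>", "<secret>"  # placeholder credentials
    IMAGE_URL = "https://example.org/photo.jpg"      # placeholder hosted image

    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": IMAGE_URL},
        auth=(IMAGGA_KEY, IMAGGA_SECRET),  # Imagga uses basic auth
    )
    for item in resp.json()["result"]["tags"]:
        print(f"{item['tag']['en']} {item['confidence']:.1f}")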

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

indoor 95.7
black and white 93.5
text 92
window 84.2
flower 70.1
person 61.3
human face 53.1
furniture 51
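
The Microsoft tags correspond to the Tags feature of the Azure Computer Vision Analyze Image API, which reports confidence on a 0-1 scale. A minimal REST sketch; the endpoint, key, and filename are placeholders, and v3.2 is an assumed API version:

    import requests

    ENDPOINT = "https://<resource>.cognitiveservices.azure.com"  # placeholder resource
    KEY = "<subscription-key>"                                   # placeholder credential

    with open("photo.jpg", "rb") as f:  # placeholder local copy of the image
        body = f.read()

    resp = requests.post(
        f"{ENDPOINT}/vision/v3.2/analyze",
        params={"visualFeatures": "Tags"},
        headers={
            "Ocp-Apim-Subscription-Key": KEY,
            "Content-Type": "application/octet-stream",
        },
        data=body,
    )
    for tag in resp.json()["tags"]:
        print(f"{tag['name']} {tag['confidence'] * 100:.1f}")  # rescale to 0-100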

Face analysis

AWS Rekognition

Age 3-11
Gender Male, 87.4%
Calm 0.2%
Sad 90.1%
Angry 0.1%
Disgusted 0.7%
Happy 0%
Surprised 0.1%
Fear 2.5%
Confused 6.3%

AWS Rekognition

Age 22-34
Gender Male, 52.4%
Happy 52.8%
Fear 45.1%
Angry 45.2%
Confused 45.1%
Calm 46%
Disgusted 45.2%
Surprised 45.5%
Sad 45.2%

AWS Rekognition

Age 31-47
Gender Male, 53.8%
Happy 52.7%
Angry 45.1%
Confused 45.1%
Calm 46.4%
Disgusted 45.1%
Fear 45.2%
Surprised 45.1%
Sad 45.3%
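
The three AWS Rekognition entries above are per-face results of the kind DetectFaces returns when all attributes are requested. A minimal boto3 sketch, assuming a local copy of the image; the filename and region are placeholders:

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")  # placeholder region

    with open("photo.jpg", "rb") as f:  # placeholder local copy of the image
        image_bytes = f.read()

    response = rekognition.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],  # request age range, gender, and emotion estimates
    )
    for face in response["FaceDetails"]:
        age, gender = face["AgeRange"], face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        for emotion in face["Emotions"]:
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")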

Microsoft Cognitive Services

Age 7
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
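
Google Vision reports face attributes as likelihood buckets rather than percentages, as in the block above. A minimal sketch with the google-cloud-vision client (2.x-style API, assuming application credentials are configured; the filename is a placeholder):

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()
    with open("photo.jpg", "rb") as f:  # placeholder local copy of the image
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)
    for face in response.face_annotations:
        # Each attribute is a Likelihood enum, e.g. VERY_UNLIKELY.
        print("Surprise", face.surprise_likelihood.name)
        print("Anger", face.anger_likelihood.name)
        print("Sorrow", face.sorrow_likelihood.name)
        print("Joy", face.joy_likelihood.name)
        print("Headwear", face.headwear_likelihood.name)
        print("Blurred", face.blurred_likelihood.name)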

Feature analysis

Amazon

Person 97.4%

Categories

Imagga

interior objects 71.1%
paintings art 24.2%
pets animals 2.9%

Text analysis

Google

AS
AS
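
The repeated "AS" above is raw OCR output from Google's text detection. A minimal sketch of the same call with the google-cloud-vision client (2.x-style API; the filename is a placeholder); the first annotation returned is the full detected text and later entries repeat it word by word, which is why a short result appears twice:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()
    with open("photo.jpg", "rb") as f:  # placeholder local copy of the image
        image = vision.Image(content=f.read())

    response = client.text_detection(image=image)
    # First entry: full text block; remaining entries: individual words.
    for annotation in response.text_annotations:
        print(annotation.description)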