Human Generated Data

Title

Untitled (stylized portrait of woman in black dress with decorative sash sitting on stairs)

Date

1920-1940, printed later

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10267

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Person 99.1
Human 99.1
Handrail 97.6
Banister 97.6
Indoors 92.7
Interior Design 92.7
Apparel 89.9
Clothing 89.9
Home Decor 89
Female 71.4
Flooring 70.7
Staircase 69.7
Furniture 64.1
Bed 64.1
Gown 61.7
Evening Dress 61.7
Fashion 61.7
Robe 61.7
Poster 59.3
Advertisement 59.3
Collage 59.3
Girl 59.2
Electronics 58.9
LCD Screen 58.9
Screen 58.9
Display 58.9
Monitor 58.9
Person 58.3
Bedroom 57.3
Room 57.3
Floor 56.1

Clarifai
created on 2019-11-16

people 99.6
adult 97.1
woman 95.8
indoors 95.8
man 95.5
one 94.4
furniture 93.9
room 93.8
two 92.3
monochrome 91.8
portrait 90.5
chair 85.3
movie 84.9
music 83.4
group 80.3
wear 80.2
actor 79.5
street 78.6
sit 78
window 77.9

Imagga
created on 2019-11-16

man 23.5
people 17.8
black 16.4
male 15.7
person 15.5
light 14
adult 13.7
silhouette 12.4
sitting 11.2
lifestyle 10.8
menorah 10.7
interior 10.6
youth 10.2
glass 10.1
groom 10
modern 9.8
fashion 9.8
style 9.6
urban 9.6
hair 9.5
women 9.5
window 9.5
studio 9.1
business 9.1
hand 9.1
room 8.8
portrait 8.4
human 8.2
indoor 8.2
photographer 8.1
music 8.1
musical instrument 8
home 8
businessman 7.9
happiness 7.8
candelabrum 7.7
happy 7.5
one 7.5
water 7.3
alone 7.3
art 7.3
metal 7.2
sexy 7.2
dress 7.2
sunset 7.2
shadow 7.2
device 7.1

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

text 95.6
black and white 79.8
person 73
woman 69.8
clothing 57.8
dress 53.2

Color Analysis

Face analysis

AWS Rekognition

Age 6-16
Gender Male, 52.1%
Angry 45.8%
Happy 45%
Disgusted 45%
Calm 53.8%
Fear 45%
Surprised 45%
Sad 45.3%
Confused 45%

AWS Rekognition

Age 7-17
Gender Female, 50.5%
Disgusted 49.5%
Sad 49.6%
Confused 49.5%
Happy 49.5%
Fear 49.5%
Surprised 49.5%
Calm 50.3%
Angry 49.5%

Microsoft Cognitive Services

Age 7
Gender Female

Feature analysis

Amazon

Person 99.1%
Bed 64.1%

Categories

Imagga

interior objects 94.2%
food drinks 2.5%
paintings art 2.1%

Text analysis

Amazon

3-5.080