Human Generated Data

Title

Untitled (two photographs: woman in lacy dress posed leaning against piano; double portrait of man posing with two young daughters)

Date

1925-1940, printed later

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10305

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Human 97.9
Person 97.9
Apparel 95.2
Clothing 95.2
Art 73.2
Poster 69.2
Advertisement 69.2
Collage 69.2
Female 68.2
Fashion 68.2
Robe 68.2
Evening Dress 68.2
Gown 68.2
Home Decor 68.1
Electronics 65.1
Screen 65.1
Monitor 62
Display 62
Person 61.7
Woman 60.9
LCD Screen 60.4
Plant 60.4
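
A minimal sketch of how label tags like the Amazon list above could be retrieved with the Rekognition DetectLabels API via boto3; the bucket and object names here are hypothetical placeholders, not part of this record.

import boto3

# Rekognition client; the region is an assumption for illustration.
rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "photo.jpg"}},
    MaxLabels=25,        # roughly the length of the list above
    MinConfidence=60.0,  # the lowest confidence shown above is about 60
)

# Print "Label confidence" pairs in the same form as the record.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')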

Clarifai
created on 2019-11-16

people 99.9
group 97.7
adult 97.2
two 97.2
wear 95.2
man 94.7
child 93
woman 92.9
group together 92.7
outfit 92.4
three 92.1
actress 91
one 90.1
monochrome 90.1
four 89.8
vehicle 88
several 87.4
room 86.8
furniture 86.7
veil 86.7

Imagga
created on 2019-11-16

barbershop 28
shop 25.8
mercantile establishment 18.9
kimono 18.5
boutique 17.2
architecture 16.4
robe 14.5
window 14.4
art 13.7
old 13.2
clothing 12.8
people 12.8
building 12.8
place of business 12.6
dress 11.7
statue 11.6
decoration 11.4
garment 11.4
detail 11.3
sculpture 10.5
black 10.5
man 10.1
city 10
religion 9.9
history 9.8
bride 9.6
wall 9.4
monument 9.3
historic 9.2
traditional 9.1
vintage 9.1
home 8.8
ancient 8.6
culture 8.5
business 8.5
portrait 8.4
house 8.4
groom 7.9
person 7.9
design 7.9
antique 7.8
elegance 7.6
tourism 7.4
color 7.2
colorful 7.2
travel 7

Microsoft
created on 2019-11-16

text 99.6
drawing 92.2
clothing 89.2
cartoon 88.6
person 88.6
sketch 81.9
building 81.2
black and white 67

Face analysis

AWS Rekognition

Age 31-47
Gender Male, 99.5%
Surprised 0.1%
Fear 0.1%
Disgusted 0.3%
Calm 95.4%
Angry 0.4%
Confused 1.7%
Happy 0.2%
Sad 1.9%

AWS Rekognition

Age 3-9
Gender Female, 55%
Angry 45%
Fear 45%
Calm 45.1%
Sad 45%
Disgusted 45%
Happy 54.8%
Confused 45%
Surprised 45%

AWS Rekognition

Age 3-11
Gender Female, 54.6%
Confused 45.2%
Angry 45.1%
Surprised 45.1%
Calm 54.2%
Disgusted 45.1%
Fear 45%
Sad 45.1%
Happy 45.2%

AWS Rekognition

Age 0-3
Gender Female, 52.4%
Confused 45%
Disgusted 45%
Surprised 45%
Calm 54.8%
Angry 45%
Sad 45.1%
Happy 45%
Fear 45%

AWS Rekognition

Age 23-35
Gender Male, 99.4%
Calm 75%
Sad 5.3%
Happy 0.2%
Angry 8.3%
Fear 0.1%
Surprised 0.6%
Confused 10%
Disgusted 0.6%

AWS Rekognition

Age 2-8
Gender Female, 53.6%
Confused 45.2%
Happy 50.3%
Fear 45%
Sad 45.2%
Surprised 45.1%
Angry 45.3%
Disgusted 45.1%
Calm 48.7%

AWS Rekognition

Age 13-23
Gender Female, 54.5%
Sad 45.1%
Disgusted 45%
Confused 45%
Surprised 45%
Happy 45%
Fear 45%
Angry 45%
Calm 54.8%
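
The per-face age, gender, and emotion estimates above follow the shape of Rekognition's DetectFaces response. A minimal sketch of how such output could be produced, again with hypothetical bucket and object names:

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "photo.jpg"}},
    Attributes=["ALL"],  # request age range, gender, and emotion estimates
)

# Print each face in the same "Age / Gender / emotions" layout as the record.
for face in response["FaceDetails"]:
    age, gender = face["AgeRange"], face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')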

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 97.9%

Categories

Imagga

paintings art 99.9%

Text analysis

Google

eare eroE
eare
eroE
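
The detected strings above ("eare eroE" as the full run, followed by the individual tokens) match the layout of Google Cloud Vision text detection, where the first annotation is the full detected string and the rest are word-level results. A minimal sketch, assuming a local copy of the image under a hypothetical file name:

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:  # hypothetical local file name
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

# First annotation: full detected string; remaining ones: individual words.
for annotation in response.text_annotations:
    print(annotation.description)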