Human Generated Data

Title

Untitled (two photographs: rephotographed nineteenth-century portrait of couple with four children; rephotographed nineteenth-century portrait of young girl standing by chair)

Date

1925-1940, printed later

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10309

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Human 95.7
Art 93.5
Person 93.2
Drawing 92.1
Text 75.4
Sketch 72.3
Doodle 60.1
Advertisement 59.2
Poster 59.1
Label 59.0
Canvas 57.1
Painting 55.7
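
The Amazon rows above pair each label with a confidence percentage. A minimal sketch of how such tags are typically obtained with AWS Rekognition's detect_labels call via boto3; the file name and the MinConfidence cutoff are hypothetical, not values recorded here.

```python
# Hypothetical reproduction of the label tags above via AWS Rekognition.
import boto3

client = boto3.client("rekognition")  # assumes AWS credentials are configured

with open("photograph.jpg", "rb") as f:  # placeholder file name
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=55,  # hypothetical cutoff; the record's lowest tag is 55.7
    )

# Rekognition returns a name and a confidence percentage per label,
# matching the "Human 95.7", "Art 93.5", ... rows above.
for label in response["Labels"]:
    print(label["Name"], round(label["Confidence"], 1))
```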

Clarifai
created on 2019-11-16

people 99.8
print 98.5
group 98.2
illustration 97.5
art 97.3
man 97.1
adult 95.8
retro 92.7
music 92.5
wear 91.6
woman 91.0
vintage 87.8
two 85.5
painting 84.0
military 81.3
old 79.4
bill 79.3
portrait 78
one 77.5
leader 77.2
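
A comparable sketch for the Clarifai scores, assuming Clarifai's v2 predict REST endpoint and its general model; the API key, model id, and image URL are placeholders, not taken from this record. Concept values come back on a 0-1 scale and are shown here scaled to percentages.

```python
# Hypothetical call against Clarifai's v2 predict REST endpoint.
import requests

headers = {"Authorization": "Key YOUR_CLARIFAI_API_KEY"}  # placeholder key
payload = {"inputs": [{"data": {"image": {"url": "https://example.org/scan.jpg"}}}]}

resp = requests.post(
    "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
    headers=headers,
    json=payload,
)

# Concepts carry values in [0, 1]; scaling by 100 yields
# scores like "people 99.8" above.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(concept["name"], round(concept["value"] * 100, 1))
```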

Imagga
created on 2019-11-16

vintage 25.6
grunge 23.8
old 23
black 19.8
book jacket 19.2
device 16.9
texture 16.7
aged 15.4
jacket 14.9
design 14.7
equipment 14.2
retro 13.9
blackboard 13.9
antique 13
window 12.1
frame 11.8
dirty 11.7
decoration 11.7
graphic 11.7
art 11.3
wrapping 11.3
ancient 11.2
pattern 10.9
silhouette 10.8
drawing 10.5
wall 10.3
grungy 9.5
paper 9.4
letter 9.2
stamp 9.1
covering 9.1
paint 9
symbol 8.7
decorative 8.3
sketch 8.3
shredder 8.3
backboard 8.2
electronic equipment 8
structure 7.9
travel 7.7
newspaper 7.6
poster 7.5
sign 7.5
house 7.5
page 7.4
light 7.3
textured 7.0
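
The Imagga list follows the same tag-plus-confidence shape. A hedged sketch against Imagga's v2 tagging endpoint with HTTP basic auth; the credentials and image URL are placeholders.

```python
# Hypothetical request to Imagga's v2 tagging endpoint.
import requests

auth = ("IMAGGA_API_KEY", "IMAGGA_API_SECRET")  # placeholder credentials

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/scan.jpg"},  # placeholder URL
    auth=auth,
)

# Each entry pairs an English tag with a confidence score,
# e.g. "vintage 25.6" as in the list above.
for tag in resp.json()["result"]["tags"]:
    print(tag["tag"]["en"], round(tag["confidence"], 1))
```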

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

drawing 99.9
sketch 99.6
cartoon 96.9
text 96.2
art 95.5
child art 95.2
painting 90.9
illustration 81.3
black and white 80.4
white 78.2
gallery 72.6
old 71.6
person 64.9
handwriting 60.0
picture frame 6.6
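
The Microsoft rows match the shape of the Computer Vision Tag Image operation. A sketch assuming the v2.0 REST API; the region, subscription key, and image URL are placeholders.

```python
# Hypothetical call to the Computer Vision "Tag Image" operation (v2.0).
import requests

endpoint = "https://westus.api.cognitive.microsoft.com"  # placeholder region
headers = {"Ocp-Apim-Subscription-Key": "YOUR_KEY"}      # placeholder key

resp = requests.post(
    f"{endpoint}/vision/v2.0/tag",
    headers=headers,
    json={"url": "https://example.org/scan.jpg"},  # placeholder URL
)

# Confidences are reported in [0, 1]; scaling by 100 gives
# rows like "drawing 99.9" above.
for tag in resp.json()["tags"]:
    print(tag["name"], round(tag["confidence"] * 100, 1))
```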

Face analysis

AWS Rekognition

Age 22-34
Gender Male, 54.6%
Happy 45%
Disgusted 45.4%
Angry 50.2%
Fear 45.1%
Calm 47.5%
Surprised 45.1%
Sad 46.3%
Confused 45.4%

AWS Rekognition

Age 0-3
Gender Female, 54.2%
Happy 46.7%
Surprised 45.5%
Angry 45.1%
Fear 51.9%
Sad 45.2%
Confused 45%
Calm 45.2%
Disgusted 45.4%

AWS Rekognition

Age 2-8
Gender Male, 50%
Angry 50.6%
Disgusted 45.1%
Happy 45%
Calm 45.4%
Sad 48.7%
Surprised 45%
Fear 45.1%
Confused 45.1%

AWS Rekognition

Age 9-19
Gender Female, 50.7%
Calm 47%
Happy 45%
Angry 45.2%
Disgusted 45%
Fear 45.1%
Sad 52.2%
Confused 45.5%
Surprised 45.1%

AWS Rekognition

Age 29-45
Gender Female, 50.8%
Angry 47.5%
Fear 45.7%
Calm 45.6%
Sad 48.8%
Disgusted 45.9%
Happy 45%
Confused 46.3%
Surprised 45.1%

AWS Rekognition

Age 25-39
Gender Male, 55%
Happy 45%
Angry 45.2%
Disgusted 45%
Surprised 45.1%
Sad 45%
Fear 45%
Calm 54.6%
Confused 45%

AWS Rekognition

Age 4-12
Gender Female, 54.2%
Happy 45%
Sad 53.3%
Disgusted 45%
Surprised 45%
Fear 45.1%
Angry 45.9%
Confused 45.1%
Calm 45.5%
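
The seven per-face blocks above follow the shape of AWS Rekognition's detect_faces response when all attributes are requested. A minimal sketch via boto3; the input file name is hypothetical.

```python
# Hypothetical reproduction of the per-face estimates via AWS Rekognition.
import boto3

client = boto3.client("rekognition")  # assumes AWS credentials are configured

with open("photograph.jpg", "rb") as f:  # placeholder file name
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age range, gender, and emotions
    )

for face in response["FaceDetails"]:
    age, gender = face["AgeRange"], face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```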

Microsoft Cognitive Services

Age 8
Gender Female
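
The single age/gender estimate above matches the Azure Face API detect operation. A sketch assuming the v1.0 endpoint with the age and gender attributes (a 2019-era call; Microsoft has since retired these attributes); region, key, and image URL are placeholders.

```python
# Hypothetical call to the Azure Face API detect operation (v1.0).
import requests

endpoint = "https://westus.api.cognitive.microsoft.com"  # placeholder region
headers = {"Ocp-Apim-Subscription-Key": "YOUR_KEY"}      # placeholder key

resp = requests.post(
    f"{endpoint}/face/v1.0/detect",
    params={"returnFaceAttributes": "age,gender"},
    headers=headers,
    json={"url": "https://example.org/scan.jpg"},  # placeholder URL
)

# Each detected face carries a point age estimate and a gender label,
# matching "Age 8 / Gender Female" above.
for face in resp.json():
    attrs = face["faceAttributes"]
    print(f"Age {attrs['age']:.0f}")
    print(f"Gender {attrs['gender'].capitalize()}")
```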

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely
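
The Google Vision rows report Likelihood enum values per attribute rather than percentages. A sketch using the google-cloud-vision Python client (the v2+ interface is assumed); the file name is a placeholder.

```python
# Hypothetical face detection with the google-cloud-vision client (v2+ API).
from google.cloud import vision

client = vision.ImageAnnotatorClient()  # assumes GOOGLE_APPLICATION_CREDENTIALS

with open("photograph.jpg", "rb") as f:  # placeholder file name
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Attributes come back as Likelihood enum values
# (VERY_UNLIKELY ... VERY_LIKELY), matching the rows above.
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```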

Feature analysis

Amazon

Person 93.2%

Categories

Imagga

paintings art 96.9%
food drinks 2.7%
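
The two scores above match the output shape of Imagga's categorization endpoint. A sketch assuming the personal_photos categorizer (an assumption; the record does not name one), with placeholder credentials and URL.

```python
# Hypothetical request to Imagga's v2 categorization endpoint,
# assuming the "personal_photos" categorizer.
import requests

auth = ("IMAGGA_API_KEY", "IMAGGA_API_SECRET")  # placeholder credentials

resp = requests.get(
    "https://api.imagga.com/v2/categories/personal_photos",
    params={"image_url": "https://example.org/scan.jpg"},  # placeholder URL
    auth=auth,
)

# Category names pair with confidence percentages,
# e.g. "paintings art 96.9%" above.
for category in resp.json()["result"]["categories"]:
    print(category["name"]["en"], f"{category['confidence']:.1f}%")
```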