Human Generated Data

Title

Untitled (fiftieth anniversary portrait of couple sitting on couch under sign)

Date

1937

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.3794

Machine Generated Data

Tags

Amazon
created on 2019-06-01

Person 98.6
Human 98.6
Person 98.5
Person 98.2
Person 76.8
Footwear 75.2
Shoe 75.2
Clothing 75.2
Apparel 75.2
People 74.4
Art 73.6
Flower 69.7
Blossom 69.7
Plant 69.7
Flower Arrangement 65.3
Face 64.9
Photo 61.5
Portrait 61.5
Photography 61.5
Flower Bouquet 60.7
Person 60.4
Drawing 59

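The Amazon tags above are label/confidence pairs of the kind returned by Amazon Rekognition's DetectLabels operation. Below is a minimal sketch of how such output could be reproduced with boto3; the local file name and the confidence threshold are illustrative assumptions, not part of the original record.

```python
# Sketch: label detection with Amazon Rekognition via boto3.
# The image path and MinConfidence value are illustrative assumptions.
import boto3

rekognition = boto3.client("rekognition")  # assumes AWS credentials and region are configured

with open("durette_portrait.jpg", "rb") as f:  # hypothetical local copy of the photograph
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=50,  # only return labels scored at 50% confidence or higher
)

# Print name/confidence pairs, mirroring the "Person 98.6" style listing above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```
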
Clarifai
created on 2019-06-01

people 99.9
group 98.4
adult 98.2
man 96.3
chair 93.3
many 92.5
furniture 91.9
leader 91.2
illustration 89.5
music 88
woman 84.9
administration 84.4
one 84.2
two 84.1
print 84
several 83.3
room 79.6
veil 78.8
wear 77.3
group together 76

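The Clarifai tags are concept/probability pairs from Clarifai's general image-recognition model. The sketch below calls the Clarifai v2 REST API directly with requests; the API key placeholder and the model identifier are assumptions that should be checked against your own Clarifai application and the current model catalog.

```python
# Sketch: concept tagging with the Clarifai v2 REST API.
# CLARIFAI_API_KEY and MODEL_ID are assumptions; verify them against your
# own Clarifai application before use.
import base64
import requests

CLARIFAI_API_KEY = "YOUR_CLARIFAI_API_KEY"   # placeholder credential
MODEL_ID = "general-image-recognition"       # assumed id of the public general model

with open("durette_portrait.jpg", "rb") as f:  # hypothetical local copy of the photograph
    image_b64 = base64.b64encode(f.read()).decode("utf-8")

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {CLARIFAI_API_KEY}"},
    json={"inputs": [{"data": {"image": {"base64": image_b64}}}]},
)
response.raise_for_status()

# Print concept/probability pairs, e.g. "people 0.999".
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value']:.3f}")
```
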
Imagga
created on 2019-06-01

grunge 28.1
drawing 22.3
design 20.8
art 20.4
retro 19.7
frame 19.2
graphic 19
silhouette 17.4
vintage 16.5
flower 16.1
pattern 15
floral 14.5
decoration 14
structure 13.7
dirty 13.6
old 13.2
sketch 13.2
texture 13.2
representation 11.9
creative 11.5
style 11.1
billboard 11.1
symbol 10.1
paint 10
cool 9.8
poster 9.4
leaf 9.3
house 9.2
element 9.1
painting 9
signboard 9
plant 9
antique 8.9
object 8.8
holiday 8.6
business 8.5
wallpaper 8.4
decorative 8.3
creation 8.3
letter 8.3
aged 8.1
shape 8.1
man 8.1
water 8
text 7.9
paper 7.8
ornament 7.8
stain 7.7
money 7.7
elegance 7.6
city 7.5
building 7.4
speed 7.3
artwork 7.3
cash 7.3
color 7.2
black 7.2
celebration 7.2
life 7.1
travel 7
card 7
season 7

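The Imagga tags come from an auto-tagging endpoint that returns tag/confidence pairs like those listed above. A sketch using Imagga's documented /v2/tags endpoint with HTTP Basic authentication follows; the key/secret placeholders and the image URL are assumptions.

```python
# Sketch: auto-tagging with the Imagga /v2/tags endpoint (HTTP Basic auth).
# The key/secret placeholders and IMAGE_URL are assumptions.
import requests

IMAGGA_KEY = "YOUR_IMAGGA_API_KEY"        # placeholder credential
IMAGGA_SECRET = "YOUR_IMAGGA_API_SECRET"  # placeholder credential
IMAGE_URL = "https://example.org/durette_portrait.jpg"  # hypothetical public URL of the image

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
)
response.raise_for_status()

# Print tag/confidence pairs, e.g. "grunge 28.1".
for tag in response.json()["result"]["tags"]:
    print(f"{tag['tag']['en']} {tag['confidence']:.1f}")
```
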
Microsoft
created on 2019-06-01

text 90.8
drawing 83.1
person 71.4
clothing 68.4
old 61.9
sketch 59.9
man 57.2
flower 52.1
painting 34.3
painted 23.3

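The Microsoft tags match the output of the Azure Computer Vision "Tag Image" operation. A minimal REST sketch is below; the endpoint host, API version, and subscription key placeholder are assumptions tied to your own Azure resource.

```python
# Sketch: image tagging with the Azure Computer Vision REST API (Tag Image).
# The endpoint host, API version, and subscription key are assumptions tied
# to your own Azure resource.
import requests

AZURE_ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"  # placeholder
AZURE_KEY = "YOUR_AZURE_SUBSCRIPTION_KEY"                             # placeholder

with open("durette_portrait.jpg", "rb") as f:  # hypothetical local copy of the photograph
    image_bytes = f.read()

response = requests.post(
    f"{AZURE_ENDPOINT}/vision/v3.2/tag",
    headers={
        "Ocp-Apim-Subscription-Key": AZURE_KEY,
        "Content-Type": "application/octet-stream",
    },
    data=image_bytes,
)
response.raise_for_status()

# Print tag/confidence pairs, e.g. "text 0.908".
for tag in response.json()["tags"]:
    print(f"{tag['name']} {tag['confidence']:.3f}")
```
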
Face analysis

AWS Rekognition

Age 57-77
Gender Male, 54.5%
Disgusted 45.1%
Calm 53.6%
Angry 45.1%
Sad 45.2%
Happy 45.4%
Surprised 45.3%
Confused 45.3%

AWS Rekognition

Age 35-52
Gender Male, 54.9%
Calm 53.7%
Surprised 45.3%
Sad 45.1%
Confused 45.2%
Disgusted 45.1%
Happy 45.4%
Angry 45.2%

AWS Rekognition

Age 20-38
Gender Male, 54.5%
Sad 50.1%
Surprised 45.1%
Happy 45%
Angry 45.2%
Calm 49.2%
Confused 45.3%
Disgusted 45%

AWS Rekognition

Age 26-43
Gender Female, 54.7%
Confused 45.6%
Surprised 45.5%
Sad 46.7%
Angry 45.7%
Happy 46.9%
Calm 49.3%
Disgusted 45.3%

Microsoft Cognitive Services

Age 60
Gender Male

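The four AWS Rekognition face records above (age range, gender, and per-emotion scores) and the Microsoft Cognitive Services age/gender estimate come from face-detection APIs. A sketch of the Rekognition side with boto3 follows; the image file name is an illustrative assumption.

```python
# Sketch: face analysis with Amazon Rekognition via boto3, requesting all
# facial attributes (age range, gender, emotions). The file name is an assumption.
import boto3

rekognition = boto3.client("rekognition")  # assumes AWS credentials and region are configured

with open("durette_portrait.jpg", "rb") as f:  # hypothetical local copy of the photograph
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # include AgeRange, Gender, Emotions, etc.
)

# Summarize each detected face in the style of the listing above.
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```
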
Feature analysis

Amazon

Person 98.6%
Shoe 75.2%

Categories

Imagga

paintings art 99.9%

Text analysis

Amazon

61887
61887 1957
1957
ves
alnes
ves ournle alnes
ournle
Cae
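
The raw strings in the text analysis block (e.g. "61887", "1957", and the partial words from the decorative lettering on the sign) are typical of Amazon Rekognition's DetectText output. A minimal sketch is below; the image file name is an illustrative assumption.

```python
# Sketch: text detection with Amazon Rekognition via boto3.
# The image file name is an illustrative assumption.
import boto3

rekognition = boto3.client("rekognition")  # assumes AWS credentials and region are configured

with open("durette_portrait.jpg", "rb") as f:  # hypothetical local copy of the photograph
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

# Print each detected word or line, mirroring the raw strings listed above.
for detection in response["TextDetections"]:
    print(detection["DetectedText"])
```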