Human Generated Data

Title

Untitled (women standing near floral memorials)

Date

1942

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7126

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2021-12-15

Human 98.6
Person 98.6
Person 94.5
Clothing 91.8
Apparel 91.8
Person 88.3
Person 86.6
Person 85.8
Person 85.4
Plant 83.7
Blossom 76.7
Flower 76.7
People 74.2
Art 74
Graphics 73.4
Floral Design 72.2
Pattern 72.2
Drawing 67.7
Text 66.9
Costume 64.3
Robe 63.2
Fashion 63.2
Photography 63
Photo 63
Flower Arrangement 61.6
Gown 59.6
Wedding 58.1
Indoors 56.8
Face 56.7
Dress 56.6
Photo Booth 56
Coat 55.4
Suit 55.4
Overcoat 55.4
Person 53.3
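
The label/score pairs above are the kind of output Amazon Rekognition's DetectLabels API returns. A minimal boto3 sketch, assuming the photograph sits in an S3 bucket; the bucket, key, and thresholds below are placeholders, not part of this record:

```python
# Sketch: retrieving label tags similar to the Amazon list above.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "steinmetz-7126.jpg"}},
    MaxLabels=50,
    MinConfidence=50.0,
)

for label in response["Labels"]:
    # Each label carries a name and a confidence score on a 0-100 scale.
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```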

Clarifai
created on 2023-10-15

people 99.5
group 96.2
man 96
adult 95.6
woman 88.8
monochrome 88.7
illustration 83.1
leader 83
desktop 82.2
wear 81.1
administration 79.5
retro 77.9
many 75.8
interaction 75
technology 73.8
spherical 73.6
old 72.4
wedding 71.7
paper 69.9
uniform 65.3

Imagga
created on 2021-12-15

drawing 42.8
sketch 40
design 25.9
art 24.2
grunge 22.2
pattern 20.5
representation 19.9
cartoon 19.6
silhouette 19.1
facility 18.3
negative 18.1
graphic 17.5
gymnasium 16.6
element 16.6
retro 15.6
decoration 15.4
frame 15
film 14.3
business 14
clip art 13
black 12.6
shape 12.6
athletic facility 12.5
sign 12
decorative 11.7
vintage 11.6
map 11.5
symbol 11.5
style 11.1
texture 11.1
photographic paper 11
old 10.5
plant 10.5
icon 10.3
finance 10.1
leaf 10.1
banner 10.1
paint 10
currency 9.9
team 9.9
backgrounds 9.7
ink 9.6
poster 9.4
man 9.4
floral 9.4
depository 9.4
product 9
collection 9
digital 8.9
bank 8.9
curve 8.8
antique 8.7
paper 8.6
treasury 8.6
money 8.5
arrow 8.5
set 8.5
modern 8.4
dollar 8.4
color 8.4
ornate 8.2
border 8.1
computer 8
people 7.8
creation 7.8
flower 7.7
wallpaper 7.7
outline 7.6
photographic equipment 7.4
equipment 7.3
person 7.3
dirty 7.2
holiday 7.2
male 7.1
summer 7.1
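
Tags like the Imagga list above can be requested from Imagga's tagging endpoint. A minimal REST sketch with the requests library, assuming an Imagga API key/secret and a publicly reachable image URL (all placeholders):

```python
# Sketch: fetching tag/confidence pairs from Imagga's /v2/tags endpoint.
import requests

API_KEY = "your_api_key"        # placeholder credentials
API_SECRET = "your_api_secret"

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/steinmetz-7126.jpg"},
    auth=(API_KEY, API_SECRET),
    timeout=30,
)
resp.raise_for_status()

for item in resp.json()["result"]["tags"]:
    # Each entry pairs a confidence score with a tag keyed by language.
    print(item["tag"]["en"], round(item["confidence"], 1))
```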

Google
created on 2021-12-15

Microsoft
created on 2021-12-15

text 97.8
black and white 93.3
wedding dress 91.3
window 86.7
bride 81.5
drawing 70
clothing 66.2
sketch 64.1
old 62.2
posing 58.6
dress 57.3
person 56.5
woman 55.5
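
The Microsoft tags above resemble the output of Azure Computer Vision's analyze operation. A minimal sketch against the v3.2 REST endpoint, assuming an Azure resource endpoint, subscription key, and image URL (all placeholders):

```python
# Sketch: requesting tags from the Azure Computer Vision analyze endpoint.
import requests

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
KEY = "your_subscription_key"                                     # placeholder

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags,Description"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": "https://example.org/steinmetz-7126.jpg"},
    timeout=30,
)
resp.raise_for_status()

for tag in resp.json()["tags"]:
    # Confidence is returned on a 0-1 scale; scale to percent for display.
    print(tag["name"], round(tag["confidence"] * 100, 1))
```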

Color Analysis

Face analysis

AWS Rekognition

Age 24-38
Gender Male, 82.7%
Calm 80.9%
Sad 7.5%
Angry 2.7%
Disgusted 2.4%
Fear 2%
Confused 1.8%
Surprised 1.4%
Happy 1.3%

AWS Rekognition

Age 55-73
Gender Male, 91.1%
Calm 54.6%
Disgusted 23%
Surprised 9.2%
Confused 3.9%
Sad 3.2%
Angry 3%
Happy 1.9%
Fear 1.1%

AWS Rekognition

Age 50-68
Gender Female, 75%
Calm 93.9%
Happy 1.9%
Sad 1.6%
Surprised 1.2%
Angry 0.4%
Disgusted 0.4%
Confused 0.3%
Fear 0.3%

AWS Rekognition

Age 51-69
Gender Male, 98.4%
Calm 96.9%
Sad 2.1%
Angry 0.4%
Surprised 0.2%
Fear 0.2%
Happy 0.1%
Confused 0.1%
Disgusted 0.1%

AWS Rekognition

Age 48-66
Gender Male, 58.6%
Happy 36.5%
Calm 35.7%
Sad 15.5%
Angry 4.3%
Fear 3.3%
Confused 2.1%
Disgusted 1.6%
Surprised 1%

AWS Rekognition

Age 13-25
Gender Female, 51.8%
Calm 92.2%
Happy 4.3%
Sad 2.4%
Disgusted 0.5%
Angry 0.4%
Surprised 0.1%
Fear 0.1%
Confused 0.1%

AWS Rekognition

Age 23-35
Gender Male, 89.5%
Calm 55.8%
Disgusted 14.1%
Surprised 10.5%
Confused 6.4%
Sad 4.5%
Happy 4.2%
Angry 2.3%
Fear 2.2%
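
Per-face blocks like the ones above (age range, gender, emotion scores) are what Amazon Rekognition's DetectFaces returns when all facial attributes are requested. A minimal boto3 sketch, with bucket and key as placeholders:

```python
# Sketch: per-face age range, gender, and emotion estimates.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "steinmetz-7126.jpg"}},
    Attributes=["ALL"],  # request age, gender, emotions, not just bounding boxes
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```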

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
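
The Google Vision blocks above report likelihood buckets rather than percentages. A minimal sketch with the google-cloud-vision client (v2+ API surface assumed; the file path is a placeholder):

```python
# Sketch: per-face likelihood buckets (joy, sorrow, anger, surprise, headwear, blur).
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("steinmetz-7126.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Each attribute is a Likelihood enum, UNKNOWN through VERY_LIKELY.
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```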

Feature analysis

Amazon

Person 98.6%

Categories

Imagga

paintings art 99.5%

Text analysis

Amazon

19608
NAGOY

Google

19
8
19608.
19 60 8 · 19608.
60
·
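
Text results such as "19608" can be produced with OCR calls like Amazon Rekognition's DetectText (Google Vision's text_detection is the analogue for the Google list). A minimal boto3 sketch, with bucket and key as placeholders:

```python
# Sketch: extracting printed text from the photograph with Rekognition DetectText.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_text(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "steinmetz-7126.jpg"}}
)

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":  # LINE entries group the individual WORD hits
        print(detection["DetectedText"], f'{detection["Confidence"]:.1f}%')
```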