Human Generated Data

Title

Untitled (crowd standing beneath memorial statue in Fairmont Park)

Date

1941

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7147

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2021-12-15

Person 99.7
Human 99.7
Person 99.5
Person 99.4
Person 93.7
Clothing 93.1
Apparel 93.1
Person 90.4
Person 85.4
Person 84.1
Person 82.4
Crowd 80.3
Person 78.1
Shorts 73.9
Face 73.3
People 73.1
Text 71.1
Drawing 67.8
Art 67.8
Person 65.2
Photo 62.4
Photography 62.4
Footwear 59.1
Shoe 59.1
Person 57.6
Sailor Suit 57.3
Symbol 56.3
Parade 55.7
Sketch 55
Shoe 52.3

Clarifai
created on 2023-10-15

people 99.6
man 98.1
woman 97.2
adult 93.8
group 93.3
many 91.9
veil 90.6
dancing 89.7
retro 87.1
leader 82.9
administration 81.1
partnership 80.8
crowd 79.8
music 79.7
group together 78.7
monochrome 78.5
wear 77.4
lid 76.5
sit 75.8
enjoyment 75.1

Imagga
created on 2021-12-15

sketch 100
drawing 87
representation 57.3
art 29.9
design 24.8
grunge 24.7
retro 23
silhouette 20.7
floral 19.6
frame 19.1
cartoon 18.7
graphic 17.5
pattern 17.1
elegance 16.8
vintage 16.6
style 16.3
decoration 14.7
clip art 13.9
flower 13.9
banner 13.8
elegant 13.7
fashion 13.6
decorative 13.4
old 12.6
leaf 12.5
poster 12.3
people 12.3
holiday 12.2
ornament 12.1
shape 11.9
texture 11.8
card 11.2
man 10.8
person 10.7
plant 10.6
party 10.3
black 10.2
symbol 10.1
paint 10
curve 9.6
model 9.3
border 9.1
menu 9
creative 8.8
light 8.7
artistic 8.7
antique 8.7
line 8.6
paper 8.4
modern 8.4
painting 8.4
color 8.4
element 8.3
sport 8.2
ornate 8.2
fun 8.2
dirty 8.1
lady 8.1
ink 7.7
scroll 7.6
curl 7.6
drink 7.5
elements 7.4
greeting 7.4
backdrop 7.4
coffee 7.4
negative 7.4
splash 7.4
dress 7.2
star 7.2
celebration 7.2
hair 7.1
posing 7.1
summer 7.1

Google
created on 2021-12-15

Microsoft
created on 2021-12-15

text 99
drawing 97.5
sketch 94.8
cartoon 84.8
clothing 77.1
person 69.1
black and white 58.8
man 53.8

Color Analysis

Face analysis

AWS Rekognition

Age 44-62
Gender Male, 82%
Sad 48.4%
Calm 46.4%
Confused 2.7%
Happy 1.2%
Angry 0.6%
Surprised 0.5%
Disgusted 0.2%
Fear 0.1%

AWS Rekognition

Age 29-45
Gender Female, 51.1%
Calm 94.7%
Sad 4.2%
Confused 0.5%
Angry 0.2%
Happy 0.2%
Surprised 0.1%
Disgusted 0.1%
Fear 0%

AWS Rekognition

Age 47-65
Gender Male, 94.3%
Sad 45.5%
Calm 45.2%
Confused 6.2%
Angry 1.3%
Disgusted 0.8%
Surprised 0.4%
Happy 0.3%
Fear 0.2%

AWS Rekognition

Age 32-48
Gender Male, 98.1%
Sad 51.6%
Angry 19.9%
Calm 19%
Surprised 5%
Confused 2.6%
Fear 1.7%
Disgusted 0.1%
Happy 0.1%

AWS Rekognition

Age 13-25
Gender Male, 91.8%
Calm 78.1%
Sad 7.7%
Happy 5.3%
Confused 3.8%
Surprised 2.7%
Fear 1.5%
Angry 0.5%
Disgusted 0.4%

AWS Rekognition

Age 19-31
Gender Male, 87.9%
Sad 97.3%
Calm 2.4%
Confused 0.1%
Angry 0.1%
Fear 0.1%
Happy 0%
Surprised 0%
Disgusted 0%

AWS Rekognition

Age 21-33
Gender Male, 78.3%
Calm 88%
Happy 8.2%
Sad 3%
Angry 0.3%
Fear 0.2%
Confused 0.2%
Disgusted 0.1%
Surprised 0.1%

AWS Rekognition

Age 26-40
Gender Female, 56.6%
Calm 50.5%
Happy 42.4%
Sad 3.8%
Angry 1.1%
Fear 0.8%
Surprised 0.8%
Disgusted 0.4%
Confused 0.4%

AWS Rekognition

Age 15-27
Gender Male, 81.4%
Calm 80.5%
Happy 11.3%
Sad 3.3%
Angry 2.4%
Disgusted 1%
Fear 0.7%
Surprised 0.5%
Confused 0.4%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Unlikely

Feature analysis

Amazon

Person 99.7%
Shoe 59.1%

Categories

Imagga

paintings art 100%

Text analysis

Amazon

17429
17429.
RV

Google

174297JA2-MATP 17429.
174297JA2-MATP
17429.