Human Generated Data

Title

Untitled (street performers singing in park)

Date

c. 1950

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7560

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Person 99.7
Human 99.7
Person 99
Person 97.8
Person 96.1
Person 95.5
Dress 94.7
Clothing 94.7
Apparel 94.7
Tree 91
Plant 91
Play 89.8
Person 89.5
Vegetation 89.2
Shorts 87.9
Grass 85.5
Outdoors 83.6
Female 80.1
Meal 78.5
Food 78.5
Face 78.5
Leisure Activities 78.4
Person 77.5
Shoe 76.6
Footwear 76.6
Yard 73.3
Nature 73.3
Person 73
Person 72
Guitar 71.4
Musical Instrument 71.4
Girl 69
Water 67.7
Land 67.6
Musician 67.1
People 64.6
Kid 63.2
Child 63.2
Poster 63
Advertisement 63
Collage 62.3
Suit 59.1
Coat 59.1
Overcoat 59.1
Vacation 56.5
Pool 56
Woman 56
Person 45.3

Clarifai
created on 2023-10-25

people 99.9
child 98.4
monochrome 97.8
adult 97.4
group 96.5
group together 96.4
woman 96
man 95.6
many 94.5
music 90.8
boy 89.3
street 87.1
administration 86.2
several 84.5
crowd 83.1
leader 82.2
musician 81.8
art 80
war 79.7
recreation 79.3

Imagga
created on 2022-01-08

adult 19.6
person 19.4
people 18.4
fountain 17.1
sport 16.4
sexy 14.5
human 14.2
posing 14.2
one 14.2
fashion 13.6
attractive 13.3
portrait 12.9
man 12.8
black 12.6
structure 12.4
women 11.9
sensuality 11.8
umbrella 11.8
dress 11.7
model 11.7
cute 11.5
outdoor 11.5
lady 11.4
water 11.3
skin 11
dark 10.9
pretty 10.5
body 10.4
hair 10.3
wall 10.3
teenager 10
joy 10
cool 9.8
fun 9.7
outdoors 9.5
lifestyle 9.4
face 9.2
male 9.2
fitness 9
light 9
wet 8.9
style 8.9
happy 8.8
urban 8.7
expression 8.5
action 8.4
relaxation 8.4
summer 8.4
weapon 8.4
city 8.3
playing 8.2
healthy 8.2
exercise 8.2
dirty 8.1
musical instrument 8
canopy 8
day 7.8
happiness 7.8
standing 7.8
world 7.7
clothing 7.7
head 7.6
danger 7.3

Google
created on 2022-01-08

Drum 93.2
Plant 91.3
Musical instrument 89.6
Human 89.4
Organism 86.9
Black-and-white 86.8
Style 84
Tree 82.5
Musician 82.3
Chair 79.5
Adaptation 79.4
Guitar 78.1
Monochrome 77.7
Art 77.6
Monochrome photography 76.2
Entertainment 74
Font 73.7
Event 70.6
Room 69
Visual arts 66

Microsoft
created on 2022-01-08

text 99.8
clothing 97.1
person 96.8
footwear 93.5
black and white 92.3
outdoor 88.4
man 85.9
tree 83.2

Face analysis

AWS Rekognition

Age 37-45
Gender Male, 99.6%
Calm 99.8%
Surprised 0%
Disgusted 0%
Confused 0%
Sad 0%
Angry 0%
Happy 0%
Fear 0%

AWS Rekognition

Age 29-39
Gender Male, 95.2%
Calm 62.7%
Confused 10.6%
Surprised 10.4%
Sad 4.7%
Angry 4.2%
Fear 4.1%
Happy 2.4%
Disgusted 1%

AWS Rekognition

Age 16-22
Gender Male, 97.7%
Fear 93.4%
Sad 2.4%
Surprised 1.9%
Confused 1%
Disgusted 0.4%
Happy 0.3%
Calm 0.3%
Angry 0.3%

AWS Rekognition

Age 45-53
Gender Male, 99.8%
Calm 86.6%
Confused 3.8%
Sad 3.5%
Disgusted 2.4%
Surprised 1.5%
Angry 0.9%
Happy 0.9%
Fear 0.4%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.7%
Shoe 76.6%
Guitar 71.4%

Text analysis

Amazon

28717.
1
NAGOX

Google

28717.
28717.