Human Generated Data

Title

Untitled (four women in front of hammer and sickle banner)

Date

c. 1950

People

Artist: Joseph Janney Steinmetz, American 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4515

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 99.4
Human 99.4
Person 99
Person 98.8
Clothing 90.2
Apparel 90.2
Person 85.5
Person 84.1
Face 80
Female 72.5
Clinic 72.3
People 68.2
Text 66.4
Doctor 63.1
Girl 61
Advertisement 59
Photography 58.5
Photo 58.5
Woman 58.1
Performer 57.7
Poster 57.4
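
The label list above is the kind of name/confidence output returned by Amazon Rekognition's DetectLabels API. The sketch below is illustrative only; the filename, region, and thresholds are assumptions, not a record of how this page's data was actually generated.

```python
# Illustrative sketch: label tags like the ones above can be produced with
# Amazon Rekognition's DetectLabels API via boto3. Filename, region, and
# thresholds are assumptions, not the pipeline actually used for this page.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")  # region is an assumption

with open("steinmetz_4.2002.4515.jpg", "rb") as f:  # hypothetical local copy of the photograph
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=25,        # cap on how many labels to return
    MinConfidence=55.0,  # discard low-confidence labels
)

# Print "Name Confidence" pairs, e.g. "Person 99.4"
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```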

Clarifai
created on 2023-10-26

people 99.9
monochrome 99.4
adult 99
woman 98.8
group 98.6
man 97.4
music 96.7
child 92.4
facial expression 90
nostalgia 89.9
musician 89.7
indoors 87.7
retro 86.3
three 84.3
actress 83.9
wear 83.4
sit 82.4
portrait 81.3
two 81.3
administration 81

Imagga
created on 2022-01-23

man 31.6
person 29.3
male 27.6
people 21.2
business 20
musical instrument 17.1
businessman 16.8
adult 14.4
work 14.2
singer 14.1
brass 13.9
grunge 13.6
professional 13.5
wind instrument 12.9
musician 12.9
bass 12.1
music 11.8
team 11.6
stage 11.5
drawing 11.5
office 11.2
finance 11
blackboard 10.8
job 10.6
human 10.5
group 10.5
old 10.4
senior 10.3
silhouette 9.9
retro 9.8
performer 9.8
success 9.6
technology 9.6
design 9.6
education 9.5
play 9.5
men 9.4
paper 9.4
device 9.3
black 9
teacher 8.9
medical 8.8
happy 8.8
room 8.6
money 8.5
portrait 8.4
art 8.4
hand 8.3
teamwork 8.3
player 8.2
laptop 8.2
paint 8.1
computer 8
smiling 8
student 7.7
diagram 7.7
chart 7.6
communication 7.6
poster 7.5
vintage 7.4
mature 7.4
event 7.4
bank 7.4
graphic 7.3
aged 7.2
looking 7.2
suit 7.2
worker 7.2
smile 7.1
science 7.1
working 7.1
modern 7

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

text 99
clothing 92.4
person 91.1
woman 75.2
dance 69.6
smile 66.3
human face 63.9
posing 60.2
old 55.1

Color Analysis

Face analysis

AWS Rekognition

Age 39-47
Gender Female, 98.6%
Happy 69.3%
Surprised 19.1%
Calm 3.1%
Sad 2.7%
Disgusted 2%
Confused 1.6%
Angry 1.4%
Fear 0.9%

AWS Rekognition

Age 47-53
Gender Female, 66.3%
Happy 78.4%
Sad 9.9%
Calm 7.9%
Confused 2%
Surprised 0.8%
Disgusted 0.4%
Angry 0.4%
Fear 0.2%

AWS Rekognition

Age 40-48
Gender Male, 96.5%
Calm 91.4%
Happy 7%
Confused 0.4%
Sad 0.4%
Disgusted 0.3%
Angry 0.2%
Surprised 0.2%
Fear 0.1%

AWS Rekognition

Age 48-56
Gender Female, 84.7%
Happy 71%
Surprised 8%
Calm 6.9%
Confused 5.9%
Disgusted 3%
Fear 2.8%
Angry 1.3%
Sad 1.1%
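
The four blocks above follow the shape of Amazon Rekognition's DetectFaces response, which reports an estimated age range, a gender guess with confidence, and a ranked list of emotions for each detected face. A minimal sketch, assuming boto3 and the same hypothetical local image file as above:

```python
# Illustrative sketch: age/gender/emotion estimates like those above come from
# Amazon Rekognition's DetectFaces API when all facial attributes are requested.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("steinmetz_4.2002.4515.jpg", "rb") as f:  # hypothetical filename
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, emotions, etc., not just bounding boxes
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions come back with confidences; sort descending to match the lists above
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```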

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
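
Google Vision reports face attributes as bucketed likelihoods (Very unlikely through Very likely) rather than percentage scores, which matches the four blocks above. A minimal sketch with the google-cloud-vision client, again assuming a hypothetical local copy of the image and configured credentials:

```python
# Illustrative sketch: Google Cloud Vision face detection returns Likelihood
# enums for joy, sorrow, anger, surprise, headwear, and blur, as listed above.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("steinmetz_4.2002.4515.jpg", "rb") as f:  # hypothetical filename
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)


def pretty(likelihood) -> str:
    # Render enum names such as VERY_UNLIKELY the way the page does ("Very unlikely")
    return vision.Likelihood(likelihood).name.replace("_", " ").capitalize()


for face in response.face_annotations:
    print("Surprise", pretty(face.surprise_likelihood))
    print("Anger", pretty(face.anger_likelihood))
    print("Sorrow", pretty(face.sorrow_likelihood))
    print("Joy", pretty(face.joy_likelihood))
    print("Headwear", pretty(face.headwear_likelihood))
    print("Blurred", pretty(face.blurred_likelihood))
```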

Feature analysis

Amazon

Person 99.4%

Categories

Imagga

paintings art 100%

Text analysis

Amazon

21433
star
NAGON-YTERAS-
mord

Google

stad NAGON-YT37A2 eieiei
stad
NAGON-YT37A2
eieiei
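
Detected strings like those above are what OCR services return for lettering in a photograph; the Amazon entries, for example, match the shape of Rekognition's DetectText output. A minimal sketch, with the same hypothetical filename assumption as the earlier examples:

```python
# Illustrative sketch: text detections like the Amazon entries above can be
# retrieved with Rekognition's DetectText API; strings are printed as returned.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("steinmetz_4.2002.4515.jpg", "rb") as f:  # hypothetical filename
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

# Keep only line-level detections; Rekognition also returns individual words
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])
```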