Human Generated Data

Title

Untitled (men and women seated near ledge in Fairmont Park, PA)

Date

1941

People

Artist: Joseph Janney Steinmetz, American 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8532

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 99.2
Human 99.2
Person 98.9
Person 98.6
Person 98.5
Person 97.9
Person 97.4
Person 96.9
Person 96.6
Person 95.7
Person 91.2
Crowd 90.8
Person 84.5
People 81
Silhouette 80.8
Person 80.6
Person 80.2
Audience 78.8
Clothing 69.4
Apparel 69.4
Leisure Activities 61.6
Musician 59.8
Musical Instrument 59.8
Sitting 58.9
Suit 58.6
Coat 58.6
Overcoat 58.6
Chair 55.9
Furniture 55.9
Music Band 55.5
Stage 55.4
Person 45.7
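
The labels above are the kind of output returned by Amazon Rekognition's label-detection endpoint. The exact pipeline used to produce this record is not documented here, but a minimal sketch of how similar tags could be generated with boto3 follows; the image file name and region are placeholders, and the thresholds are assumptions chosen to cover the scores listed above.

```python
import boto3

IMAGE_PATH = "steinmetz_4_2002_8532.jpg"  # placeholder file name for the digitized print

rekognition = boto3.client("rekognition", region_name="us-east-1")  # region is an assumption

with open(IMAGE_PATH, "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,       # enough to cover the list above
    MinConfidence=45,   # the lowest score shown above is 45.7
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```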

Clarifai
created on 2023-10-25

people 99.8
group 98.9
many 97.2
group together 96.6
music 95.7
man 95.3
woman 92.9
adult 92.8
musician 91.3
child 90.4
leader 88
audience 85.7
education 82.2
singer 80.9
crowd 78.3
microphone 77.3
furniture 74.8
administration 74.5
speaker 71.6
room 71.4

Imagga
created on 2022-01-09

man 34.9
male 31.9
people 31.8
person 29.5
business 27.3
group 25
musical instrument 24.7
adult 22.6
businessman 22.1
men 20.6
office 19.6
stage 18.7
chair 18.1
corporate 18
professional 17.1
wind instrument 16.7
silhouette 16.6
brass 16
meeting 16
job 15.9
laptop 15.8
modern 15.4
teacher 15.4
lifestyle 15.2
communication 15.1
work 14.9
table 14.7
room 14.6
executive 14.5
classroom 14.4
black 13.9
happy 13.8
sitting 13.7
women 13.4
worker 13.4
indoors 13.2
businesswoman 12.7
team 12.5
platform 12
employee 11.6
percussion instrument 11.6
photographer 11.5
music 10.9
musician 10.8
computer 10.5
success 10.5
couple 10.4
indoor 10
attractive 9.8
interior 9.7
hands 9.6
education 9.5
smiling 9.4
singer 9.3
smile 9.3
board 9.1
suit 9
device 8.9
night 8.9
together 8.8
love 8.7
class 8.7
happiness 8.6
boss 8.6
glass 8.6
businesspeople 8.5
youth 8.5
hall 8.4
blackboard 8.4
teamwork 8.3
leisure 8.3
holding 8.3
performer 8.2
technology 8.2
style 8.2
handsome 8
looking 8
life 8
boy 7.8
students 7.8
party 7.7
diversity 7.7
crowd 7.7
casual 7.6
two 7.6
talking 7.6
tie 7.6
relax 7.6
career 7.6
desk 7.6
fashion 7.5
building 7.5
fun 7.5
manager 7.4
student 7.4
window 7.3
cheerful 7.3
teenager 7.3
confident 7.3
cornet 7.2
portrait 7.1
to 7.1
microphone 7.1

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

text 99.3
person 95.9
clothing 95.3
man 85.6
people 75.4
musical instrument 74.1
black 71.9
black and white 69.3
concert 68.9
group 60.9
posing 48

Color Analysis

Face analysis

AWS Rekognition

Age 34-42
Gender Male, 98.8%
Calm 76.5%
Sad 7.5%
Happy 7.3%
Confused 5.4%
Disgusted 1.4%
Surprised 1.2%
Angry 0.6%
Fear 0.2%

AWS Rekognition

Age 45-53
Gender Female, 60.7%
Calm 83.2%
Surprised 8%
Happy 2.5%
Fear 2.2%
Sad 1.4%
Disgusted 1.3%
Confused 0.7%
Angry 0.7%

AWS Rekognition

Age 35-43
Gender Male, 97.1%
Sad 83.8%
Calm 14.9%
Confused 0.4%
Disgusted 0.3%
Happy 0.3%
Angry 0.2%
Fear 0.1%
Surprised 0.1%

AWS Rekognition

Age 48-54
Gender Male, 93%
Happy 35.3%
Sad 28%
Confused 16.4%
Calm 9.3%
Disgusted 5.3%
Angry 2.3%
Fear 2%
Surprised 1.5%

AWS Rekognition

Age 24-34
Gender Female, 52.8%
Sad 72.1%
Confused 8.7%
Happy 6.6%
Surprised 5.8%
Calm 2.1%
Fear 2.1%
Disgusted 1.6%
Angry 1.1%

AWS Rekognition

Age 52-60
Gender Male, 93.3%
Calm 100%
Sad 0%
Happy 0%
Surprised 0%
Disgusted 0%
Confused 0%
Angry 0%
Fear 0%

AWS Rekognition

Age 22-30
Gender Male, 51.3%
Calm 93.9%
Sad 1.6%
Surprised 1.2%
Happy 1.1%
Disgusted 0.6%
Fear 0.6%
Confused 0.5%
Angry 0.5%

AWS Rekognition

Age 31-41
Gender Male, 53.1%
Calm 77.7%
Surprised 6.7%
Disgusted 5.2%
Sad 3.9%
Angry 2%
Fear 1.8%
Confused 1.5%
Happy 1.1%

AWS Rekognition

Age 28-38
Gender Female, 99.9%
Calm 50.3%
Sad 34.2%
Happy 12.3%
Confused 1.4%
Fear 0.9%
Disgusted 0.5%
Surprised 0.3%
Angry 0.2%
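
Each AWS Rekognition block above (an age range, a gender estimate, and an emotion distribution per detected face) matches the shape of Rekognition's face-detection response. A hedged sketch with boto3, reusing the same placeholder image name as before:

```python
import boto3

IMAGE_PATH = "steinmetz_4_2002_8532.jpg"  # placeholder file name

rekognition = boto3.client("rekognition", region_name="us-east-1")  # region is an assumption

with open(IMAGE_PATH, "rb") as f:
    image_bytes = f.read()

# Attributes=["ALL"] is required to get age range, gender, and emotion scores.
response = rekognition.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age, gender = face["AgeRange"], face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions come back unordered; sort to mirror the listings above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
    print()
```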

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
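
The Google Vision blocks above report per-face likelihoods (surprise, anger, sorrow, joy, headwear, blur). A minimal sketch using the google-cloud-vision client, assuming application-default credentials and the same placeholder image:

```python
from google.cloud import vision

IMAGE_PATH = "steinmetz_4_2002_8532.jpg"  # placeholder file name

client = vision.ImageAnnotatorClient()  # assumes application-default credentials

with open(IMAGE_PATH, "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Likelihood values are enum members such as VERY_UNLIKELY, UNLIKELY, POSSIBLE, ...
for face in response.face_annotations:
    for attr in ("surprise", "anger", "sorrow", "joy", "headwear", "blurred"):
        value = getattr(face, f"{attr}_likelihood")
        print(attr.capitalize(), vision.Likelihood(value).name.replace("_", " ").capitalize())
    print()
```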

Feature analysis

Amazon

Person 99.2%
Chair 55.9%

Text analysis

Amazon

19433
17433.
KODVA
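
These strings are raw text detections, likely negative numbers and edge markings on the print, as returned by Rekognition's text-detection endpoint. A sketch, reusing the placeholder image:

```python
import boto3

IMAGE_PATH = "steinmetz_4_2002_8532.jpg"  # placeholder file name

rekognition = boto3.client("rekognition", region_name="us-east-1")  # region is an assumption

with open(IMAGE_PATH, "rb") as f:
    response = rekognition.detect_text(Image={"Bytes": f.read()})

# LINE entries are whole strings; WORD entries are the individual tokens.
for det in response["TextDetections"]:
    print(det["Type"], det["DetectedText"], f"{det['Confidence']:.1f}")
```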

Google

17433. בירד
17433.
בירד
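
The Google strings above, including the Hebrew-character token that is probably an OCR misread of a marking on the print, are the sort of output produced by Vision's text detection. A brief sketch under the same assumptions as the earlier Vision example:

```python
from google.cloud import vision

IMAGE_PATH = "steinmetz_4_2002_8532.jpg"  # placeholder file name

client = vision.ImageAnnotatorClient()  # assumes application-default credentials

with open(IMAGE_PATH, "rb") as f:
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

# The first annotation is the full detected text; the rest are individual tokens.
for annotation in response.text_annotations:
    print(annotation.description)
```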