Human Generated Data

Title

Untitled (group of people seated on porch steps with accordion)

Date

1949

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10593

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 99.7
Human 99.7
Person 99.5
Person 99.3
Clothing 98.4
Shoe 98.4
Footwear 98.4
Apparel 98.4
Person 97.9
Person 97.7
Person 97.5
Person 96.3
Person 96.1
Person 95
Person 91
Musical Instrument 86.2
People 86.2
Person 77.6
Housing 72.3
Building 72.3
Accordion 70.2
Shoe 69.5
Shoe 64.1
Photography 61.9
Photo 61.9
Portrait 61.1
Face 61.1
Shoe 61.1
Shoe 55.4

Clarifai
created on 2023-10-25

people 100
group 99.8
child 99.2
many 98.5
group together 98.1
adult 97.9
education 97.6
woman 96.3
school 95.9
boy 95.1
man 95
administration 92.5
elementary school 91.5
teacher 91
several 90.9
music 90.2
outfit 90.2
wear 88.7
leader 88.6
family 86.1

Imagga
created on 2022-01-09

musical instrument 82.4
accordion 74
keyboard instrument 57.9
wind instrument 54.1
male 24.1
man 23.7
people 20.6
person 19.5
men 18
silhouette 17.4
adult 16.3
black 14.4
lifestyle 13.7
portrait 12.3
sport 11.7
fashion 11.3
group 11.3
sitting 11.2
business 10.9
room 10.9
casual 10.2
attractive 9.8
style 9.6
musician 9.3
music 9.1
art 9
outdoors 8.9
lady 8.9
businessman 8.8
player 8.8
device 8.7
women 8.7
youth 8.5
pretty 8.4
studio 8.4
leisure 8.3
teenager 8.2
exercise 8.2
sexy 8
posing 8
play 7.7
party 7.7
relaxation 7.5
happy 7.5
fun 7.5
graphic 7.3
pose 7.2
smiling 7.2
copy space 7.2
computer 7.2
job 7.1
chair 7.1
modern 7
together 7

Google
created on 2022-01-09

Window 90.6
Musical instrument 89.7
Black 89.5
Dress 85.9
Style 83.9
Black-and-white 83.7
Accordionist 78.5
Art 78.3
Folk instrument 77.1
Monochrome 73.2
Monochrome photography 73
Event 72.9
Vintage clothing 70.5
Accordion 70.1
Music 68.6
Crew 67.3
Font 66.3
Drum 66.1
Room 65.8
Fun 64.8

Microsoft
created on 2022-01-09

building 99.2
text 93.4
outdoor 91
window 86.9
person 73.4
clothing 70.8
posing 61.6
accordion 51.3

Face analysis

AWS Rekognition

Age 45-51
Gender Female, 83.6%
Happy 40%
Sad 16.8%
Confused 14.9%
Angry 13.1%
Surprised 6.5%
Calm 4.9%
Disgusted 2.9%
Fear 1%

AWS Rekognition

Age 38-46
Gender Male, 80.9%
Happy 94.2%
Calm 2%
Surprised 1.5%
Disgusted 1.1%
Angry 0.4%
Sad 0.4%
Confused 0.3%
Fear 0.1%

AWS Rekognition

Age 47-53
Gender Female, 51%
Surprised 32.5%
Sad 26.1%
Calm 16.9%
Confused 11.3%
Angry 4.4%
Happy 4%
Fear 2.7%
Disgusted 2.1%

AWS Rekognition

Age 50-58
Gender Male, 96.9%
Sad 98.1%
Happy 0.5%
Confused 0.4%
Fear 0.4%
Disgusted 0.2%
Calm 0.2%
Angry 0.2%
Surprised 0.1%

AWS Rekognition

Age 28-38
Gender Male, 97.4%
Calm 95.4%
Sad 1.5%
Angry 1.3%
Disgusted 0.6%
Confused 0.4%
Happy 0.3%
Surprised 0.3%
Fear 0.2%

AWS Rekognition

Age 34-42
Gender Male, 98.3%
Sad 50.2%
Surprised 20%
Confused 14.1%
Happy 6.4%
Calm 4.1%
Angry 2.3%
Disgusted 1.7%
Fear 1%

AWS Rekognition

Age 21-29
Gender Male, 99.9%
Sad 45%
Calm 36.5%
Confused 14.2%
Angry 2%
Disgusted 1.2%
Surprised 0.6%
Fear 0.3%
Happy 0.2%

AWS Rekognition

Age 42-50
Gender Female, 97.8%
Happy 64.6%
Confused 20.4%
Sad 3.8%
Calm 3.7%
Fear 3.3%
Surprised 1.8%
Angry 1.4%
Disgusted 1%

AWS Rekognition

Age 33-41
Gender Female, 57.8%
Happy 88.1%
Surprised 5.9%
Fear 1.8%
Sad 1.2%
Confused 0.9%
Angry 0.8%
Disgusted 0.7%
Calm 0.6%

AWS Rekognition

Age 50-58
Gender Male, 98.7%
Calm 92.7%
Sad 6.3%
Confused 0.3%
Angry 0.3%
Disgusted 0.2%
Surprised 0.2%
Fear 0.1%
Happy 0.1%

AWS Rekognition

Age 53-61
Gender Female, 98.8%
Happy 43.7%
Sad 21.1%
Calm 12.7%
Fear 11.4%
Confused 4.3%
Angry 3.3%
Disgusted 2%
Surprised 1.4%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.7%
Shoe 98.4%

Text analysis

Amazon

6
1