Human Generated Data

Title

Untitled (group of people seated on porch steps with accordion)

Date

1949

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10597

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 99.7
Human 99.7
Person 99.5
Person 99.2
Person 96.1
Person 94.2
Person 93.7
Person 93.7
Shoe 93.1
Footwear 93.1
Clothing 93.1
Apparel 93.1
People 85
Musical Instrument 83.7
Female 80
Person 67.6
Girl 65.7
Dress 64.2
Woman 57.6
Suit 56.6
Coat 56.6
Overcoat 56.6
Person 51.2
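
The label/confidence pairs above follow the output format of Amazon Rekognition's DetectLabels operation. A minimal boto3 sketch of how such tags could be produced; the file name, region, and thresholds are illustrative assumptions, not part of this record:

import boto3

# Rekognition client; the region is an assumption for illustration.
client = boto3.client("rekognition", region_name="us-east-1")

# Hypothetical local scan of the photograph.
with open("steinmetz_porch_steps.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,
    MinConfidence=50,  # the tag list above bottoms out near 50%
)

# Print "Label Confidence" pairs, e.g. "Person 99.7"
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')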

Clarifai
created on 2023-10-25

people 100
group 99.3
child 98.8
adult 98.1
group together 97.9
many 97.6
man 97.5
woman 96.5
education 96.5
school 95
boy 94.8
administration 91.2
leader 87.5
music 86.8
sit 86.6
several 83.9
wear 83.6
chair 82.7
adolescent 82.4
elementary school 81.9

Imagga
created on 2022-01-09

sax 41.4
brass 40.7
wind instrument 40
man 26.2
male 24.8
musical instrument 24.5
silhouette 23.2
trombone 20.8
person 19.4
people 18.9
cornet 17.9
business 15.2
black 15
businessman 15
sport 14.9
adult 14.4
group 12.1
men 12
player 10.9
exercise 10.9
sky 10.8
shadow 9.8
human 9.7
job 9.7
portrait 9.7
stage 9.3
active 9.3
fitness 9
office 9
success 8.8
body 8.8
athlete 8.7
women 8.7
boy 8.7
light 8.7
lifestyle 8.7
boss 8.6
dark 8.3
city 8.3
building 8.2
speedway 8.2
suit 8.1
activity 8.1
professional 8
drawing 7.8
work 7.8
play 7.8
motion 7.7
construction 7.7
crowd 7.7
employee 7.6
chart 7.6
two 7.6
power 7.6
racetrack 7.5
fun 7.5
outdoors 7.5
holding 7.4
action 7.4
symbol 7.4
device 7.3
competition 7.3
design 7.3
alone 7.3
team 7.2
together 7
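
The Imagga tags above are the kind of result returned by Imagga's image-tagging REST endpoint. A hedged sketch using the requests library; the credentials and image URL are placeholders, and the response layout follows Imagga's documented /v2/tags format:

import requests

API_KEY = "your_imagga_api_key"        # placeholder
API_SECRET = "your_imagga_api_secret"  # placeholder
IMAGE_URL = "https://example.org/steinmetz_porch_steps.jpg"  # placeholder

# The tagging endpoint uses HTTP Basic auth with the API key/secret pair.
response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
    timeout=30,
)
response.raise_for_status()

# Each entry carries a confidence score and a language-keyed tag name.
for entry in response.json()["result"]["tags"]:
    print(f'{entry["tag"]["en"]} {entry["confidence"]:.1f}')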

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

building 99.6
text 96.5
outdoor 90.8
person 89.6
clothing 85.3
dance 80
black and white 71.6
posing 36.1
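
The Microsoft tags above resemble output from Azure's Computer Vision image-tagging API. A sketch assuming the azure-cognitiveservices-vision-computervision SDK; the endpoint, key, and image URL are placeholders:

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

ENDPOINT = "https://your-resource.cognitiveservices.azure.com/"  # placeholder
KEY = "your_azure_key"                                           # placeholder
IMAGE_URL = "https://example.org/steinmetz_porch_steps.jpg"      # placeholder

client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))

# Tag the image; each tag has a name and a 0-1 confidence score.
result = client.tag_image(IMAGE_URL)
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")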

Color Analysis

Face analysis

AWS Rekognition

Age 33-41
Gender Male, 60.9%
Sad 72.4%
Confused 25.8%
Angry 0.5%
Surprised 0.4%
Calm 0.3%
Happy 0.3%
Disgusted 0.2%
Fear 0.1%

AWS Rekognition

Age 48-54
Gender Male, 99.5%
Sad 79.4%
Confused 9.7%
Surprised 4.2%
Angry 3.4%
Calm 1.7%
Disgusted 0.7%
Happy 0.5%
Fear 0.4%

AWS Rekognition

Age 45-51
Gender Female, 93.7%
Happy 83.1%
Calm 7.7%
Fear 2.5%
Disgusted 1.6%
Surprised 1.5%
Sad 1.2%
Angry 1.2%
Confused 1.1%

AWS Rekognition

Age 33-41
Gender Male, 98.6%
Calm 49.2%
Disgusted 21.7%
Sad 15.3%
Surprised 4.5%
Happy 3.5%
Confused 3%
Angry 2.2%
Fear 0.7%

AWS Rekognition

Age 33-41
Gender Male, 96.5%
Happy 44.3%
Sad 23.2%
Confused 10.8%
Disgusted 9.4%
Fear 4.2%
Surprised 3.5%
Angry 2.3%
Calm 2.2%

AWS Rekognition

Age 48-56
Gender Female, 59.9%
Sad 60.2%
Calm 31.1%
Happy 2.7%
Confused 1.7%
Fear 1.5%
Angry 1.2%
Disgusted 1.1%
Surprised 0.5%

AWS Rekognition

Age 29-39
Gender Female, 87.8%
Happy 36.2%
Surprised 20.7%
Calm 13.4%
Fear 12.7%
Angry 7%
Sad 4.6%
Disgusted 3%
Confused 2.3%
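
The age ranges, gender estimates, and emotion scores in the blocks above correspond to Amazon Rekognition's DetectFaces operation with all facial attributes requested. A minimal boto3 sketch; the file name and region are assumptions:

import boto3

client = boto3.client("rekognition", region_name="us-east-1")

with open("steinmetz_porch_steps.jpg", "rb") as f:
    image_bytes = f.read()

# Attributes=["ALL"] adds age range, gender, and emotion estimates per face.
response = client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions arrive with confidences; sort descending to mirror the listing above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')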

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely
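
The Google Vision blocks above report per-face likelihood ratings (surprise, anger, sorrow, joy, headwear, blur) rather than numeric scores. A sketch assuming the google-cloud-vision client library; the image path is a placeholder:

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("steinmetz_porch_steps.jpg", "rb") as f:  # placeholder path
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Likelihoods are enum values such as VERY_UNLIKELY, UNLIKELY, POSSIBLE, LIKELY.
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)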

Feature analysis

Amazon

Person 99.7%
Shoe 93.1%

Categories

Imagga

paintings art 89.7%
people portraits 8.3%

Text analysis

Amazon

5