Human Generated Data

Title

Untitled (men and women listening to man play accordion)

Date

c. 1940

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8299

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Person 99.3
Human 99.3
Person 98.5
Person 98.4
Person 98.2
Person 97.7
Clothing 97.5
Apparel 97.5
Person 96.5
Person 96.1
Interior Design 96.1
Indoors 96.1
Person 95.1
Person 88.4
Person 87.9
Dress 85.8
Face 84.6
Person 84.6
Room 78.9
People 75.6
Chair 75.3
Furniture 75.3
Floor 73.7
Person 73.6
Child 70.4
Kid 70.4
Female 70.3
Person 66.9
Portrait 66.6
Photography 66.6
Photo 66.6
Shorts 65
Girl 63.7
Baby 60.7
Flooring 57.7
Text 57.6
Advertisement 55.5
Poster 55.3
Building 55.2
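
For context, tags of this form are what Amazon Rekognition's DetectLabels operation returns. A minimal sketch with boto3, assuming a local copy of the image (the filename, region, and thresholds are placeholders, not values from this record):

```python
# A minimal sketch, assuming a local copy of the image; the filename,
# region, and thresholds below are placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("steinmetz_8299.jpg", "rb") as f:  # hypothetical filename
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=50,
        MinConfidence=55,  # the tag list above bottoms out around 55%
    )

# Each label carries a name and a 0-100 confidence score, matching the
# "Person 99.3" style of the list above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```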

Clarifai
created on 2023-10-25

people 99.8
many 98.4
group together 98.1
group 97.9
man 95.9
adult 95.6
education 94.7
child 94.7
woman 93.2
wear 91.4
school 87.2
dancing 83.9
several 83.1
outfit 82.6
recreation 81.9
leader 81.5
indoors 80
uniform 79.1
music 77
boy 75.3
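
The Clarifai tags follow the same name-plus-score pattern. A hedged sketch against Clarifai's public general model, based on its published gRPC quick-start (the model ID, user/app IDs, image URL, and key are assumptions, not values from this record):

```python
# A hedged sketch based on Clarifai's published gRPC quick-start; the model
# ID, user/app IDs, image URL, and "YOUR_PAT" key are assumptions.
from clarifai_grpc.channel.clarifai_channel import ClarifaiChannel
from clarifai_grpc.grpc.api import resources_pb2, service_pb2, service_pb2_grpc

stub = service_pb2_grpc.V2Stub(ClarifaiChannel.get_grpc_channel())

request = service_pb2.PostModelOutputsRequest(
    user_app_id=resources_pb2.UserAppIDSet(user_id="clarifai", app_id="main"),
    model_id="general-image-recognition",  # Clarifai's public general model
    inputs=[
        resources_pb2.Input(
            data=resources_pb2.Data(
                image=resources_pb2.Image(url="https://example.org/image.jpg")
            )
        )
    ],
)
response = stub.PostModelOutputs(
    request, metadata=(("authorization", "Key YOUR_PAT"),)
)

# Concepts come back with a name and a 0-1 value; scaling by 100 gives
# scores in the same form as the list above.
for concept in response.outputs[0].data.concepts:
    print(f"{concept.name} {concept.value * 100:.1f}")
```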

Imagga
created on 2022-01-08

person 31.8
people 29.5
nurse 27.4
male 24.8
man 23.5
men 19.7
group 19.3
adult 17.7
planner 17.1
business 17
businessman 15.9
black 15
couple 14.8
silhouette 13.2
party 12.9
women 12.6
crowd 12.5
portrait 12.3
clothing 12.2
happy 11.9
team 11.6
professional 11.5
stage 11.1
human 10.5
dance 10.3
room 10.2
work 10.2
two 10.2
life 10.1
lifestyle 10.1
hand 9.9
job 9.7
art 9.7
style 9.6
happiness 9.4
old 9
dress 9
fun 9
night 8.9
success 8.8
casual 8.5
worker 8.2
performer 7.9
patient 7.9
teacher 7.7
youth 7.7
elegance 7.5
fashion 7.5
dark 7.5
friendship 7.5
leisure 7.5
camera 7.4
occupation 7.3
design 7.3
brass 7.2
office 7.2
suit 7.2
home 7.2
celebration 7.2
holiday 7.2
family 7.1
love 7.1
indoors 7
together 7
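
Imagga's tagger is a plain REST endpoint. A sketch using HTTP Basic auth (the image URL and the key/secret pair are placeholders):

```python
# A sketch against Imagga's v2 tagging endpoint; the image URL and the
# key/secret pair are placeholders.
import requests

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/image.jpg"},
    auth=("YOUR_API_KEY", "YOUR_API_SECRET"),  # HTTP Basic auth
)
resp.raise_for_status()

# Tags arrive as {"confidence": <0-100>, "tag": {"en": <name>}} objects.
for tag in resp.json()["result"]["tags"]:
    print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')
```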

Google
created on 2022-01-08

Font 82.1
Suit 78.6
Monochrome 74.4
Snapshot 74.3
Vintage clothing 72.7
Monochrome photography 72.5
Event 71.2
Team 70.2
Classic 67.4
Stock photography 65.9
History 65.4
Room 62.1
Art 60.4
Photo caption 59.6
Crew 57.6
Uniform 56
Circle 52.2
Photographic paper 50.9
Family 50.6
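
The Google tags correspond to Cloud Vision's label-detection feature. A minimal sketch with the official Python client (the image URL is a placeholder):

```python
# A minimal sketch with the official google-cloud-vision client; the image
# URL is a placeholder.
from google.cloud import vision

client = vision.ImageAnnotatorClient()
image = vision.Image(
    source=vision.ImageSource(image_uri="https://example.org/image.jpg")
)
response = client.label_detection(image=image)

# Scores are 0-1 floats; scaling by 100 matches the "Font 82.1" style above.
for label in response.label_annotations:
    print(f"{label.description} {label.score * 100:.1f}")
```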

Microsoft
created on 2022-01-08

text 98.7
clothing 98.3
person 93.8
man 91.8
outdoor 90.4
woman 58.6
posing 44
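
Microsoft's tags map to Azure Computer Vision's image-tagging operation. A hedged sketch (the endpoint, key, and image URL are placeholders):

```python
# A hedged sketch with the azure-cognitiveservices-vision-computervision
# SDK; the endpoint, key, and image URL are placeholders.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://YOUR_RESOURCE.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("YOUR_KEY"),
)

# tag_image returns name/confidence pairs; confidences are 0-1 floats.
result = client.tag_image("https://example.org/image.jpg")
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")
```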

Face analysis

AWS Rekognition

Age 28-38
Gender Male, 99.7%
Calm 93.5%
Sad 2.7%
Happy 1.1%
Confused 1%
Surprised 0.8%
Disgusted 0.4%
Angry 0.3%
Fear 0.3%

AWS Rekognition

Age 49-57
Gender Male, 93.4%
Calm 99.8%
Sad 0.1%
Surprised 0%
Happy 0%
Confused 0%
Disgusted 0%
Angry 0%
Fear 0%

AWS Rekognition

Age 39-47
Gender Male, 99.7%
Calm 75%
Sad 20.2%
Happy 2.6%
Surprised 1.1%
Confused 0.5%
Disgusted 0.3%
Fear 0.2%
Angry 0.2%

AWS Rekognition

Age 50-58
Gender Female, 80.7%
Calm 100%
Surprised 0%
Happy 0%
Disgusted 0%
Angry 0%
Confused 0%
Sad 0%
Fear 0%

AWS Rekognition

Age 36-44
Gender Female, 51.7%
Calm 85.5%
Happy 5.1%
Sad 4.8%
Surprised 1.6%
Disgusted 0.9%
Confused 0.7%
Angry 0.7%
Fear 0.6%

AWS Rekognition

Age 45-53
Gender Female, 90.1%
Calm 98.9%
Confused 0.3%
Sad 0.3%
Angry 0.2%
Surprised 0.1%
Disgusted 0.1%
Fear 0.1%
Happy 0%

AWS Rekognition

Age 33-41
Gender Male, 94.1%
Calm 50.6%
Sad 24.5%
Happy 16.7%
Confused 4.2%
Disgusted 1.4%
Angry 1%
Fear 0.8%
Surprised 0.8%

AWS Rekognition

Age 28-38
Gender Female, 98.3%
Happy 55%
Calm 36.1%
Sad 4.1%
Confused 2.1%
Surprised 1.2%
Disgusted 0.7%
Angry 0.5%
Fear 0.3%

AWS Rekognition

Age 23-31
Gender Female, 93.4%
Happy 77.5%
Calm 16.9%
Sad 2.8%
Surprised 0.9%
Fear 0.9%
Angry 0.5%
Disgusted 0.4%
Confused 0.2%
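
Each AWS Rekognition block above is one face returned by the DetectFaces operation with all attributes requested. A sketch of that call (the filename and region are placeholders):

```python
# A sketch of the DetectFaces call; filename and region are placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("steinmetz_8299.jpg", "rb") as f:  # hypothetical filename
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # required to get age, gender, and emotions
    )

# One FaceDetail per detected face, mirroring the per-face blocks above.
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```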

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely
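
The Google Vision blocks are per-face likelihood ratings from Cloud Vision face detection. A minimal sketch (the image URL is a placeholder):

```python
# A minimal sketch; the image URL is a placeholder.
from google.cloud import vision

client = vision.ImageAnnotatorClient()
image = vision.Image(
    source=vision.ImageSource(image_uri="https://example.org/image.jpg")
)
response = client.face_detection(image=image)

# Each face annotation carries Likelihood enums (VERY_UNLIKELY ... VERY_LIKELY)
# for the six attributes listed above.
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```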

Feature analysis

Amazon

Person 99.3%

Text analysis

Amazon

9691
9691.
EITW
VELV SVEETA EITW
SVEETA
VELV
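
These fragments are raw OCR output of the kind Rekognition's DetectText operation returns. A sketch (the filename and region are placeholders):

```python
# A sketch of the DetectText call; filename and region are placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("steinmetz_8299.jpg", "rb") as f:  # hypothetical filename
    response = rekognition.detect_text(Image={"Bytes": f.read()})

# Detections come back twice: once per LINE and once per WORD, which is why
# the list above repeats fragments like "9691".
for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"])
```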

Google

9691 9691 9691. PA
9691
9691.
PA
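
The Google strings likewise match Cloud Vision text detection. A matching sketch (the image URL is a placeholder):

```python
# A matching sketch for Cloud Vision OCR; the image URL is a placeholder.
from google.cloud import vision

client = vision.ImageAnnotatorClient()
image = vision.Image(
    source=vision.ImageSource(image_uri="https://example.org/image.jpg")
)
response = client.text_detection(image=image)

# The first annotation is the full recovered text ("9691 9691 9691. PA"
# above); the rest are the individual tokens.
for annotation in response.text_annotations:
    print(annotation.description)
```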