Human Generated Data

Title

Untitled (men and women seated on couches and chairs)

Date

1940

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8460

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Person 99.3
Human 99.3
Person 98.7
Person 98.4
Person 98.1
Person 98
Person 97.7
Person 97.6
Chair 96.8
Furniture 96.8
Sitting 93.5
Tie 89.7
Accessories 89.7
Accessory 89.7
Plant 81.5
Flower 80.6
Blossom 80.6
Flooring 78.4
Clothing 77.6
Apparel 77.6
Flower Arrangement 75.9
People 72.1
Photography 64.8
Photo 64.8
Portrait 60.8
Face 60.8
Indoors 59.4
Flower Bouquet 56.9
Ikebana 56.9
Ornament 56.9
Vase 56.9
Pottery 56.9
Jar 56.9
Art 56.9
Suit 56.8
Coat 56.8
Overcoat 56.8
Couch 56.2
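
Each Amazon tag above pairs a label with a 0-100 confidence score. As a point of reference only, the following is a minimal sketch of how label/confidence pairs in this form can be produced with the AWS Rekognition DetectLabels API via boto3; the file name and the MinConfidence threshold are placeholders, and this is illustrative rather than the pipeline actually used to generate the data on this page. Clarifai, Imagga, and Microsoft expose analogous tagging endpoints for their lists below.

```python
# Minimal sketch: label/confidence pairs like the Amazon tags above,
# via the AWS Rekognition DetectLabels API (boto3).
import boto3

rekognition = boto3.client("rekognition")  # assumes AWS credentials are configured

with open("steinmetz_8460.jpg", "rb") as f:  # hypothetical local copy of the image
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=50,  # placeholder cutoff; the list above tails off in the 50-60 range
)

# Each label carries a name and a 0-100 confidence score, e.g. "Person 99.3".
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```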

Clarifai
created on 2023-10-26

people 99.9
group 99.3
adult 97.9
group together 96.6
leader 95.9
wear 95.5
woman 95.4
man 95.2
education 94.9
many 93.5
sit 92.1
chair 91.4
administration 90.4
sitting 90.2
teacher 88.8
several 86.2
furniture 86.1
outfit 86.1
child 85.7
rehearsal 82.6

Imagga
created on 2022-01-15

brass 100
wind instrument 100
musical instrument 71.3
cornet 34
male 32.6
man 32.2
people 30.7
businessman 29.1
business 27.9
person 26.3
adult 26
group 25
men 21.5
office 19.3
meeting 17.9
businesswoman 17.3
corporate 17.2
team 17
couple 16.5
happy 16.3
job 15.9
executive 15.7
professional 14.7
black 14.4
businesspeople 14.2
women 13.4
work 13.3
suit 12.6
chair 12.3
indoors 12.3
smiling 12.3
desk 12.3
together 12.3
education 12.1
table 12.1
teamwork 12
laptop 11.8
room 11.3
modern 11.2
indoor 10.9
lifestyle 10.8
worker 10.8
conference 10.7
interior 10.6
boss 10.5
success 10.5
looking 10.4
manager 10.2
communication 10.1
smile 10
colleagues 9.7
sitting 9.4
teacher 9.3
study 9.3
casual 9.3
classroom 9.2
portrait 9.1
computer 8.8
employee 8.8
attractive 8.4
device 8.3
holding 8.3
successful 8.2
student 8.1
handsome 8
home 8
to 8
working 7.9
coworkers 7.9
happiness 7.8
boy 7.8
hall 7.8
students 7.8
blackboard 7.8
employment 7.7
class 7.7
elegant 7.7
youth 7.7
talking 7.6
silhouette 7.4
cheerful 7.3
girls 7.3
new 7.3
board 7.2
trombone 7.2
horn 7.2
school 7.2

Google
created on 2022-01-15

Microsoft
created on 2022-01-15

person 99.2
wall 96
text 95.7
table 92.7
furniture 92.5
chair 84.5
clothing 84.1
old 59.5
posing 58.9
man 57.5
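
The Microsoft tags follow the same label-plus-score pattern. A hedged sketch using the Azure Computer Vision Python SDK (azure-cognitiveservices-vision-computervision) is below; the endpoint, key, and file name are placeholders, and the SDK reports confidence on a 0-1 scale, so it is rescaled here to resemble the percentage-style scores listed above.

```python
# Hedged sketch: tag names and scores like the Microsoft list above, using the
# Azure Computer Vision SDK. Endpoint, key, and file name are placeholders.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",  # placeholder endpoint
    CognitiveServicesCredentials("<your-key>"),              # placeholder key
)

with open("steinmetz_8460.jpg", "rb") as f:  # hypothetical local copy of the image
    result = client.tag_image_in_stream(f)

# The SDK returns 0-1 confidences; multiply by 100 to match scores like "person 99.2".
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")
```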

Color Analysis

Face analysis


AWS Rekognition

Age 47-53
Gender Male, 88.1%
Sad 62.1%
Calm 33.2%
Confused 1.1%
Angry 1.1%
Disgusted 0.9%
Surprised 0.7%
Happy 0.4%
Fear 0.4%

AWS Rekognition

Age 26-36
Gender Male, 99.8%
Calm 97.4%
Sad 2%
Surprised 0.5%
Disgusted 0%
Angry 0%
Confused 0%
Happy 0%
Fear 0%

AWS Rekognition

Age 51-59
Gender Male, 99.4%
Calm 99%
Surprised 0.6%
Sad 0.2%
Disgusted 0.1%
Confused 0.1%
Happy 0.1%
Fear 0.1%
Angry 0%

AWS Rekognition

Age 51-59
Gender Female, 66.8%
Calm 98%
Surprised 0.6%
Sad 0.6%
Happy 0.2%
Angry 0.2%
Confused 0.2%
Disgusted 0.2%
Fear 0%

AWS Rekognition

Age 48-56
Gender Male, 98.5%
Calm 64.2%
Sad 28.4%
Happy 2.9%
Confused 2.8%
Disgusted 0.5%
Angry 0.4%
Surprised 0.4%
Fear 0.3%

AWS Rekognition

Age 45-51
Gender Female, 97%
Calm 94.9%
Sad 4.5%
Angry 0.2%
Happy 0.1%
Surprised 0.1%
Disgusted 0.1%
Confused 0.1%
Fear 0%
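
Each AWS Rekognition block above reports an estimated age range, a gender guess with confidence, and an emotion distribution for one detected face. A minimal sketch of how such per-face attributes can be obtained with boto3's DetectFaces call follows; the file name is a placeholder and the printed layout only mirrors the blocks above, not the museum's actual process.

```python
# Minimal sketch: per-face age range, gender, and emotion estimates like the
# AWS Rekognition blocks above, via boto3 DetectFaces with all attributes.
import boto3

rekognition = boto3.client("rekognition")

with open("steinmetz_8460.jpg", "rb") as f:  # hypothetical local copy of the image
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # required to get AgeRange, Gender, and Emotions
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotion confidences sum to roughly 100%, as in the blocks above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```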

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
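
The Google Vision face results are reported as likelihood buckets ("Very unlikely" through "Very likely") rather than numeric scores. A minimal sketch using the google-cloud-vision client is below; the file name is a placeholder and the printout simply mirrors the attribute order above.

```python
# Minimal sketch: per-face likelihood attributes like the Google Vision blocks
# above, using the google-cloud-vision ImageAnnotatorClient.
from google.cloud import vision

client = vision.ImageAnnotatorClient()  # assumes GCP credentials are configured

with open("steinmetz_8460.jpg", "rb") as f:  # hypothetical local copy of the image
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

ATTRIBUTES = ["surprise", "anger", "sorrow", "joy", "headwear", "blurred"]

for face in response.face_annotations:
    for attr in ATTRIBUTES:
        # Each attribute is a Likelihood enum (VERY_UNLIKELY ... VERY_LIKELY),
        # which is why the section above reads "Very unlikely" rather than a score.
        likelihood = getattr(face, f"{attr}_likelihood")
        print(attr.capitalize(), vision.Likelihood(likelihood).name)
```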

Feature analysis

Amazon

Person 99.3%
Tie 89.7%

Text analysis

Amazon

14496

Google

449 T4996.
449
T4996.
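
The text analysis values are OCR readings of the numbers written on the print. A minimal sketch of how such strings can be extracted with the AWS Rekognition DetectText API follows; the file name is a placeholder. Google Vision's text_detection plays the analogous role for the Google results, where the first annotation is the full string and the remaining annotations are its individual tokens, which is why both "449 T4996." and its parts are listed.

```python
# Minimal sketch: OCR text strings like "14496" above, via the AWS Rekognition
# DetectText API (boto3).
import boto3

rekognition = boto3.client("rekognition")

with open("steinmetz_8460.jpg", "rb") as f:  # hypothetical local copy of the image
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

# LINE detections give whole strings; WORD detections give individual tokens.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])
```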