Human Generated Data

Title

Untitled (men and women seated around formal dining room table)

Date

1940

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8444

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Person 99.2
Human 99.2
Person 98.7
Person 98
Person 97.9
Person 97.8
Potted Plant 97.7
Pottery 97.7
Plant 97.7
Vase 97.7
Jar 97.7
Person 96.6
Person 94.9
Interior Design 91
Indoors 91
Person 86
Flower 85.5
Blossom 85.5
Person 85.3
Person 84.4
Room 84.4
Furniture 84.2
Table 79
Flower Arrangement 75.2
Planter 75.2
Lamp 72.2
People 69.5
Apparel 68
Clothing 68
Herbal 67.8
Herbs 67.8
Tree 65.1
Ikebana 64.7
Ornament 64.7
Art 64.7
Photography 63.2
Photo 63.2
Chair 62
Dining Table 60.3
Living Room 59.7
Outdoors 59.2
Crowd 58.5
Suit 57.6
Coat 57.6
Overcoat 57.6
Flower Bouquet 56.7
Person 46.4

Clarifai
created on 2023-10-26

people 99.9
group 99.3
many 98.6
administration 98.6
group together 97.9
woman 97.8
adult 97.3
man 95.4
leader 94.8
furniture 92.5
child 91.9
wear 89.9
several 89.8
actress 88.6
recreation 88.2
home 87.6
monochrome 86.1
music 83.4
war 82.4
chair 81.8

Imagga
created on 2022-01-15

room 46
classroom 44.7
man 26.2
person 26
people 25.6
male 22
teacher 18.9
home 18.3
men 17.1
work 16.5
group 16.1
table 15.6
indoors 14.9
adult 14.1
business 14
modern 13.3
interior 13.3
office 12.4
businessman 12.3
professional 11.6
smiling 11.6
couple 11.3
meeting 11.3
happy 11.3
education 11.2
senior 11.2
sitting 11.1
old 11.1
indoor 10.9
house 10.8
hall 10.5
women 10.3
chair 10.1
musical instrument 10
student 9.7
corporate 9.4
speaker 9.4
center 9.2
worker 9.1
wind instrument 9
building 9
cheerful 8.9
family 8.9
blackboard 8.9
job 8.8
brass 8.5
portrait 8.4
hand 8.3
teamwork 8.3
restaurant 8.3
life 8.2
team 8.1
computer 8
conference 7.8
colleagues 7.8
class 7.7
educator 7.6
desk 7.5
study 7.4
vintage 7.4
mature 7.4
board 7.3
new 7.3
lifestyle 7.2
shop 7.2
equipment 7.1
night 7.1
stage 7
happiness 7

Google
created on 2022-01-15

Plant 85.5
Black-and-white 85.1
Chair 84.2
Style 83.8
Musician 83.4
Art 81.4
Picture frame 77.1
Musical instrument 73.5
Suit 73.4
Font 73.3
Event 72.8
Monochrome 72.5
Monochrome photography 71.9
Room 70.1
Drum 70.1
Vintage clothing 68.5
History 64.8
Illustration 63.3
Visual arts 60.7
Music 60.3

Microsoft
created on 2022-01-15

text 96.3
person 92.4
concert 92.2
clothing 90.9
man 80.7
group 63.3
musical instrument 60.7
people 59.3
black and white 58.1

Face analysis

AWS Rekognition

Age 45-51
Gender Male, 59.9%
Calm 100%
Sad 0%
Happy 0%
Surprised 0%
Angry 0%
Disgusted 0%
Confused 0%
Fear 0%

AWS Rekognition

Age 45-53
Gender Male, 81%
Calm 99.7%
Sad 0.1%
Confused 0%
Surprised 0%
Disgusted 0%
Angry 0%
Happy 0%
Fear 0%

AWS Rekognition

Age 39-47
Gender Male, 74.4%
Happy 47.2%
Calm 37.1%
Sad 6.7%
Angry 4%
Confused 1.7%
Surprised 1.6%
Disgusted 1%
Fear 0.8%

AWS Rekognition

Age 49-57
Gender Male, 98.3%
Calm 95.4%
Sad 4%
Confused 0.3%
Disgusted 0.1%
Surprised 0.1%
Angry 0.1%
Fear 0%
Happy 0%

AWS Rekognition

Age 39-47
Gender Female, 57.1%
Calm 93.7%
Happy 3.6%
Sad 1.8%
Surprised 0.3%
Angry 0.2%
Confused 0.2%
Fear 0.1%
Disgusted 0.1%

AWS Rekognition

Age 28-38
Gender Female, 67.6%
Calm 89.5%
Happy 6.3%
Sad 1.2%
Surprised 1%
Confused 0.8%
Disgusted 0.6%
Fear 0.3%
Angry 0.3%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.2%

Text analysis

Amazon

12607
0
A70A
22399 A70A
22399

Google

12607
12607 12607.
12607.