Human Generated Data

Title

Untitled (people seated in chairs on slate patio)

Date

1952

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10639

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Person 99.4
Human 99.4
Person 99.2
Person 99
Person 98.3
Person 97.2
Chair 96.9
Furniture 96.9
Footwear 95.7
Clothing 95.7
Apparel 95.7
Shoe 95.7
Person 95
Person 92.6
Leisure Activities 78.1
Sitting 72.8
Indoors 67.8
Room 67.3
Interior Design 66.8
Person 63.4
Musical Instrument 62
Shoe 61
Suit 58.9
Coat 58.9
Overcoat 58.9
Cafe 58
Restaurant 58
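
The number after each tag is Rekognition's confidence score on a 0-100 scale. As a minimal sketch (not the museum's documented pipeline), labels like these are typically retrieved with boto3; the credentials setup and the local filename "photo.jpg" are assumptions, not details from this record:

```python
# Minimal sketch: fetch image labels with AWS Rekognition's DetectLabels.
# Assumes boto3 is installed, AWS credentials are configured, and
# "photo.jpg" is a hypothetical local copy of the photograph.
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=50,
        MinConfidence=50,  # this record lists labels down to about 58
    )

# Each label carries a name and a 0-100 confidence score.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```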

Clarifai
created on 2023-10-26

people 99.5
monochrome 96.3
man 96
woman 95.4
group 94.6
adult 94.4
sit 94.2
group together 93.9
chair 90.1
child 87.3
recreation 85.5
many 82.5
furniture 81.4
street 81.1
indoors 80.4
family 74.6
leader 74.6
several 74.1
wait 73.4
sitting 73.1
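
Clarifai's general model scores each concept between 0 and 1; the values above are those scores multiplied by 100. A minimal sketch against Clarifai's public v2 REST predict endpoint follows; the token, model id, and filename are placeholders, and account-specific details (such as user and app ids) may also be required:

```python
# Minimal sketch: request concepts from Clarifai's general model over
# REST. Endpoint path and payload follow Clarifai's public v2 API;
# the token, model id, and filename are hypothetical placeholders.
import base64
import requests

API_TOKEN = "YOUR_CLARIFAI_TOKEN"        # placeholder
MODEL_ID = "general-image-recognition"   # stock general model id

with open("photo.jpg", "rb") as f:       # hypothetical local copy
    image_b64 = base64.b64encode(f.read()).decode()

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_TOKEN}"},
    json={"inputs": [{"data": {"image": {"base64": image_b64}}}]},
)
resp.raise_for_status()

# Concept values come back in 0-1; the record shows them x100.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")
```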

Imagga
created on 2022-01-15

room 25.7
table 25.4
brass 24.9
chair 24.1
wind instrument 22
restaurant 18.1
people 17.8
glass 16.9
musical instrument 16.5
classroom 16.1
male 14.9
man 14.8
sax 14.5
interior 14.1
indoors 14
person 13.7
dinner 13.6
seat 12.3
men 12
business 11.5
women 11.1
lifestyle 10.8
urban 10.5
dining 10.5
furniture 9.8
tables 9.8
outdoors 9.7
group 9.7
couple 9.6
party 9.4
hall 9.4
bass 9.3
building 9.2
modern 9.1
meal 9
chairs 8.8
cafeteria 8.8
scene 8.6
day 8.6
comfortable 8.6
sitting 8.6
device 8.5
eat 8.4
life 8.4
relaxation 8.4
floor 8.4
wedding 8.3
indoor 8.2
family 8
adult 7.9
black 7.8
education 7.8
empty 7.7
lunch 7.7
setting 7.7
elegant 7.7
outside 7.7
outdoor 7.6
school 7.6
wine 7.6
drink 7.5
city 7.5
event 7.4
food 7.3
home 7.2
decor 7.1
happiness 7
banquet 7
travel 7
together 7
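
Imagga reports a 0-100 confidence per tag, matching the scale shown above. A minimal sketch against Imagga's public /v2/tags endpoint, with placeholder credentials and a hypothetical image URL:

```python
# Minimal sketch: fetch tags from Imagga's /v2/tags endpoint using
# HTTP basic auth. Credentials and the image URL are placeholders.
import requests

auth = ("YOUR_IMAGGA_KEY", "YOUR_IMAGGA_SECRET")  # placeholders
image_url = "https://example.org/photo.jpg"       # hypothetical

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": image_url},
    auth=auth,
)
resp.raise_for_status()

# Imagga returns a 0-100 confidence and an English tag name.
for item in resp.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")
```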

Google
created on 2022-01-15

(no tags listed in this record)

Microsoft
created on 2022-01-15

text 98.3
furniture 95.5
chair 95.4
person 87.6
table 87.6
clothing 81
black and white 73.9
several 10.8
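
Microsoft's tagger (Azure Computer Vision) scores tags between 0 and 1; the values above are those scores multiplied by 100. A minimal sketch against the v3.2 tag operation; the resource endpoint, key, and filename are placeholders:

```python
# Minimal sketch: tag an image with Azure Computer Vision's v3.2 "tag"
# operation. Endpoint, key, and filename are hypothetical placeholders.
import requests

endpoint = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
key = "YOUR_AZURE_KEY"                                          # placeholder

with open("photo.jpg", "rb") as f:  # hypothetical local copy
    resp = requests.post(
        f"{endpoint}/vision/v3.2/tag",
        headers={
            "Ocp-Apim-Subscription-Key": key,
            "Content-Type": "application/octet-stream",
        },
        data=f.read(),
    )
resp.raise_for_status()

# Azure returns 0-1 confidences; the record shows them x100.
for tag in resp.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")
```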

Face analysis

AWS Rekognition

Age 34-42
Gender Female, 59.9%
Happy 52.2%
Sad 37%
Calm 4.1%
Confused 3.4%
Disgusted 1.1%
Angry 1%
Surprised 0.8%
Fear 0.5%

AWS Rekognition

Age 30-40
Gender Male, 93.9%
Calm 100%
Sad 0%
Angry 0%
Surprised 0%
Disgusted 0%
Happy 0%
Confused 0%
Fear 0%

AWS Rekognition

Age 28-38
Gender Female, 66.3%
Calm 99.8%
Sad 0.2%
Happy 0%
Surprised 0%
Disgusted 0%
Angry 0%
Confused 0%
Fear 0%

AWS Rekognition

Age 27-37
Gender Male, 81.8%
Calm 89.4%
Confused 4.9%
Sad 4.8%
Happy 0.3%
Angry 0.2%
Disgusted 0.2%
Surprised 0.1%
Fear 0.1%
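
Rekognition found four faces, each with an estimated age range, a gender guess carrying its own confidence, and eight emotion scores that sum to roughly 100%. A minimal sketch of requesting these per-face attributes with boto3 (credentials and filename are assumptions):

```python
# Minimal sketch: per-face age, gender, and emotion estimates with
# AWS Rekognition's DetectFaces. Assumes boto3, configured credentials,
# and a hypothetical local copy of the photograph.
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # required for age, gender, and emotions
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions are listed highest-confidence first, as in the record.
    for emotion in sorted(face["Emotions"],
                          key=lambda e: e["Confidence"], reverse=True):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```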

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
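
Google Vision detected six faces and reports bucketed likelihoods (Very unlikely through Very likely) rather than numeric scores; every attribute on all six faces here falls in the lowest bucket. A minimal sketch with the google-cloud-vision client library (credentials and filename are assumptions):

```python
# Minimal sketch: bucketed face likelihoods with the Google Cloud
# Vision client library. Assumes google-cloud-vision is installed,
# credentials are configured, and the filename is hypothetical.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Vision returns Likelihood enums (e.g. VERY_UNLIKELY), not scores.
for face in response.face_annotations:
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)
```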

Feature analysis

Amazon

Person 99.4%
Chair 96.9%
Shoe 95.7%
Suit 58.9%
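
These entries repeat the top Tags scores for object classes Rekognition can localize. One plausible reading, an assumption rather than a documented detail of this record, is that the feature list keeps only labels returned with bounding-box instances:

```python
# Assumption, not a documented pipeline detail: keep only DetectLabels
# results that include located instances (bounding boxes).
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:  # hypothetical local copy
    response = client.detect_labels(Image={"Bytes": f.read()})

for label in response["Labels"]:
    if label["Instances"]:  # labels with at least one bounding box
        print(f"{label['Name']} {label['Confidence']:.1f}%")
```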

Text analysis

Amazon

34966
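
The detected string 34966 is presumably a stock or negative number written on the print. A minimal sketch of reading such text with Rekognition's DetectText (credentials and filename are assumptions):

```python
# Minimal sketch: read printed or handwritten text with AWS Rekognition's
# DetectText. Assumes boto3, configured credentials, and a hypothetical
# local copy of the photograph.
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = client.detect_text(Image={"Bytes": f.read()})

# LINE detections give whole strings; WORD detections give tokens.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"], f"{detection['Confidence']:.1f}%")
```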