Human Generated Data

Title

Untitled (young men and women sitting on bench along wall)

Date

1937

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8199

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Person 99.1
Human 99.1
Person 98.8
Person 98.5
Person 98.4
Person 98.4
Person 97.8
Person 95.9
Person 92
Poster 82.8
Advertisement 82.8
Collage 78.3
Bird 76.4
Animal 76.4
Clothing 73.7
Apparel 73.7
Sunglasses 70.9
Accessories 70.9
Accessory 70.9
Art 67.3
Text 65.6
People 65.2
Person 63.8
Furniture 62.1
Person 50
Person 42.3
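
The label/confidence pairs above have the shape of AWS Rekognition DetectLabels output. A minimal sketch of how such tags could be generated, assuming configured boto3 credentials and a placeholder local file photo.jpg (both assumptions, not part of this record):

```python
import boto3

# Rekognition client; region and credentials come from the standard AWS config.
rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

# DetectLabels returns objects/concepts with 0-100 confidence scores,
# the same format as the "Amazon" tag list above.
response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=30,
    MinConfidence=40,
)

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```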

Clarifai
created on 2023-10-26

people 99.9
group 99.1
many 98.3
adult 98.2
group together 96.1
man 96.1
woman 94.3
education 93.4
wear 92.3
child 89.6
leader 86.6
several 85.9
music 85.6
outfit 84.3
no person 81.9
room 80.9
furniture 80.3
school 79.6
musician 79.2
uniform 77.8
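
The Clarifai concepts follow the same name-plus-confidence pattern. A hypothetical sketch against Clarifai's v2 REST predict endpoint, assuming a personal access token and the public general-image-recognition model (token, image URL, and model alias are placeholders, not taken from this record):

```python
import requests

# Placeholder values: supply your own Clarifai PAT and image URL.
PAT = "YOUR_CLARIFAI_PAT"
IMAGE_URL = "https://example.com/photo.jpg"

response = requests.post(
    "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
    headers={"Authorization": f"Key {PAT}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
)
response.raise_for_status()

# Each concept carries a name and a 0-1 value; the list above appears
# to show the same values scaled to percentages.
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')
```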

Imagga
created on 2022-01-08

person 16.7
cockpit 16.3
man 16.1
people 15.6
male 14.9
center 13.1
power 12.6
vehicle 12.5
room 12.5
city 12.4
equipment 12.1
danger 10.9
safety 10.1
vacation 9.8
old 9.7
technology 9.6
industry 9.4
business 9.1
industrial 9.1
human 9
men 8.6
travel 8.4
world 8.3
house 8.3
device 8.2
retro 8.2
group 8
computer 8
antique 8
home 8
shop 7.9
adult 7.9
urban 7.8
boy 7.8
black 7.8
summer 7.7
hand 7.7
grunge 7.6
sport 7.6
finance 7.6
happy 7.5
dark 7.5
vintage 7.4
back 7.3
girls 7.3
transportation 7.2
family 7.1
work 7.1
businessman 7
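
Imagga exposes its tagger as a REST endpoint with HTTP basic auth. A sketch assuming a placeholder API key/secret pair and a public image URL:

```python
import requests

# Placeholder credentials; Imagga issues a key/secret pair per account.
API_KEY = "YOUR_IMAGGA_KEY"
API_SECRET = "YOUR_IMAGGA_SECRET"
IMAGE_URL = "https://example.com/photo.jpg"

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
)
response.raise_for_status()

# Imagga reports each tag with a 0-100 confidence, matching the list above.
for tag in response.json()["result"]["tags"]:
    print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')
```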

Google
created on 2022-01-08

Picture frame 82.6
Suit 81.3
Font 80.6
Art 72.2
Illustration 65.8
Room 63.1
History 62.1
Monochrome 60.6
Team 59.6
Machine 58
Photo caption 54.3
Monochrome photography 53.8
Airplane 53.7
Chair 53.7
Pattern 53.4
Rectangle 52.8
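
The Google tags read like Cloud Vision label annotations (the API reports 0-1 scores; the list above shows them as percentages). A minimal sketch assuming the google-cloud-vision client library, application-default credentials, and the same placeholder photo.jpg:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

# label_detection returns label annotations, each with a description
# and a 0-1 score.
response = client.label_detection(image=image)

for label in response.label_annotations:
    print(f"{label.description} {label.score * 100:.1f}")
```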

Microsoft
created on 2022-01-08

text 99.1
clothing 88.6
person 85.7
drawing 65.3
cartoon 53.9
posing 39.3
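
The Microsoft tags match the shape of the Azure Computer Vision tagging call. A sketch assuming the azure-cognitiveservices-vision-computervision package with a placeholder endpoint, key, and image URL:

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Placeholder endpoint/key for an Azure Computer Vision resource.
client = ComputerVisionClient(
    "https://YOUR_RESOURCE.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("YOUR_KEY"),
)

# tag_image returns tags with a name and a 0-1 confidence.
result = client.tag_image("https://example.com/photo.jpg")
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")
```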

Face analysis

Amazon

AWS Rekognition

Age 2-8
Gender Male, 56.7%
Sad 71.6%
Calm 27.9%
Fear 0.1%
Disgusted 0.1%
Angry 0.1%
Happy 0.1%
Confused 0.1%
Surprised 0%

AWS Rekognition

Age 39-47
Gender Male, 99.8%
Calm 90%
Confused 3.4%
Happy 2.3%
Sad 2%
Disgusted 0.9%
Surprised 0.7%
Angry 0.4%
Fear 0.3%

AWS Rekognition

Age 31-41
Gender Female, 73.4%
Calm 93%
Happy 4.8%
Confused 0.7%
Disgusted 0.4%
Surprised 0.4%
Sad 0.4%
Angry 0.2%
Fear 0.1%

AWS Rekognition

Age 34-42
Gender Female, 59.8%
Calm 99.6%
Sad 0.2%
Confused 0.1%
Disgusted 0.1%
Angry 0%
Happy 0%
Surprised 0%
Fear 0%

AWS Rekognition

Age 23-31
Gender Male, 93.2%
Surprised 81.4%
Sad 9.1%
Calm 6.2%
Disgusted 0.8%
Fear 0.8%
Confused 0.8%
Happy 0.5%
Angry 0.4%

AWS Rekognition

Age 22-30
Gender Female, 88.3%
Calm 99.7%
Happy 0.1%
Sad 0.1%
Fear 0%
Disgusted 0%
Surprised 0%
Confused 0%
Angry 0%
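
Each block above (age range, gender, ranked emotions) mirrors one FaceDetail from Rekognition DetectFaces when called with Attributes=['ALL']. A minimal sketch reusing the placeholder photo.jpg:

```python
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

# Attributes=['ALL'] requests age range, gender, and emotion estimates;
# each FaceDetail corresponds to one "AWS Rekognition" block above.
response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```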

Feature analysis

Amazon

Person 99.1%
Bird 76.4%
Sunglasses 70.9%

Captions

Microsoft
created on 2022-01-08

text 43.7%

Text analysis

Amazon

WOOD
jbea
wen
RL
Reminoton
STRAITIER
PEEZ
nsavino
HT
faon
faon 10
ils
10
TOMER
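
The word fragments above are raw OCR output; the odd spellings are the detector's reading of text in the photograph and are kept as reported. A minimal sketch of the corresponding Rekognition DetectText call, again with the placeholder photo.jpg:

```python
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

# DetectText returns LINE and WORD detections; printing the WORD-level
# DetectedText strings reproduces a list like the one above.
response = rekognition.detect_text(Image={"Bytes": image_bytes})

for detection in response["TextDetections"]:
    if detection["Type"] == "WORD":
        print(detection["DetectedText"])
```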

Google

Reinoto wen navino WOOD ibeal STRAITIFE
Reinoto
wen
navino
WOOD
ibeal
STRAITIFE