Human Generated Data

Title

Untitled (men and women surrounding table of fish)

Date

c. 1950

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4469

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags (label, confidence %)

Amazon
created on 2022-01-23

Person 99.6
Human 99.6
Person 99.5
Person 98.9
Person 85.6
People 70.5
Workshop 70.1
Building 66.8
Car 62.6
Transportation 62.6
Vehicle 62.6
Automobile 62.6
Tabletop 58.6
Furniture 58.6
Bench 58.4
Factory 57.5
Sunglasses 57.3
Accessories 57.3
Accessory 57.3
Table 56.3
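
These label/confidence pairs are the standard output of AWS Rekognition's DetectLabels operation. A minimal boto3 sketch of how such a list could be reproduced (the file name is hypothetical, and AWS credentials are assumed to be configured):

    import boto3

    # Send image bytes to Rekognition and print label/confidence pairs,
    # mirroring the "Person 99.6 ... Table 56.3" list above.
    rekognition = boto3.client("rekognition")

    with open("steinmetz_untitled.jpg", "rb") as f:  # hypothetical file name
        response = rekognition.detect_labels(
            Image={"Bytes": f.read()},
            MaxLabels=20,        # the list above has 20 entries
            MinConfidence=50.0,  # lowest score shown above is 56.3
        )

    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")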

Clarifai
created on 2023-10-26

people 99.7
adult 97.5
man 95.4
group 94.8
woman 94.3
child 90.7
monochrome 90.2
sit 89.2
retro 83.7
administration 83.2
education 82.3
group together 81.1
vintage 80.1
chair 79.9
wear 79.4
paper 79
nostalgia 78.7
war 78.4
boy 77.6
musician 76.7
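
The Clarifai tags are the kind of concepts returned by Clarifai's v2 predict endpoint with its public general image-recognition model. A hedged sketch over plain HTTP; the API key, model ID, and image URL are placeholders, and the request shape follows Clarifai's documented v2 format:

    import requests

    CLARIFAI_API_KEY = "YOUR_API_KEY"       # placeholder credential
    MODEL_ID = "general-image-recognition"  # Clarifai's public general model

    payload = {
        "inputs": [
            {"data": {"image": {"url": "https://example.com/photo.jpg"}}}  # placeholder URL
        ]
    }
    resp = requests.post(
        f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
        headers={"Authorization": f"Key {CLARIFAI_API_KEY}"},
        json=payload,
    )
    # Concepts carry a 0-1 value; scale by 100 to match "people 99.7" above.
    for concept in resp.json()["outputs"][0]["data"]["concepts"]:
        print(f"{concept['name']} {concept['value'] * 100:.1f}")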

Imagga
created on 2022-01-23

negative 77.3
film 61.1
photographic paper 46.5
newspaper 45.7
product 35.9
daily 31.5
photographic equipment 31
creation 27.1
people 17.8
person 17.1
work 15.1
man 14.8
male 14.2
sitting 13.7
business 13.4
lifestyle 11.6
chair 11.3
men 11.2
businessman 10.6
modern 10.5
human 10.5
portrait 10.3
leisure 10
team 9.9
equipment 9.8
adult 9.7
snow 9.7
looking 9.6
day 9.4
youth 9.4
hand 9.1
home 8.8
happy 8.8
architecture 8.6
winter 8.5
casual 8.5
relax 8.4
worker 8
working 8
building 7.9
urban 7.9
scene 7.8
cold 7.7
culture 7.7
sky 7.6
professional 7.6
house 7.5
outdoors 7.5
holding 7.4
technology 7.4
teamwork 7.4
room 7.4
businesswoman 7.3
group 7.2
music 7.2
black 7.2
clothing 7.2
smile 7.1
women 7.1
cool 7.1
travel 7
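
Imagga's tag list (and the "paintings art" entry in the Categories section below, which comes from its categorizer endpoint) matches the output of Imagga's /v2/tags REST endpoint. A minimal sketch with placeholder credentials and image URL:

    import requests

    IMAGGA_KEY, IMAGGA_SECRET = "YOUR_KEY", "YOUR_SECRET"  # placeholders

    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.com/photo.jpg"},  # placeholder URL
        auth=(IMAGGA_KEY, IMAGGA_SECRET),  # HTTP Basic auth with key/secret
    )
    # Each entry pairs an English tag with a 0-100 confidence,
    # matching "negative 77.3 ... travel 7" above.
    for entry in resp.json()["result"]["tags"]:
        print(f"{entry['tag']['en']} {entry['confidence']:.1f}")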

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

text 99.9
book 98.2
person 95.2
clothing 95.1
furniture 92.8
drawing 88.5
man 88.4
table 88.1
footwear 61
old 57.1
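
The Microsoft tags (and the captions in the Captions section below) have the shape produced by Azure Computer Vision's Analyze Image operation with the Tags and Description features. A sketch against the v3.2 REST endpoint; the resource endpoint, key, and file name are placeholders:

    import requests

    ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"  # placeholder
    KEY = "YOUR_SUBSCRIPTION_KEY"                                   # placeholder

    with open("steinmetz_untitled.jpg", "rb") as f:  # hypothetical file name
        image_bytes = f.read()

    resp = requests.post(
        f"{ENDPOINT}/vision/v3.2/analyze",
        params={"visualFeatures": "Tags,Description"},
        headers={"Ocp-Apim-Subscription-Key": KEY,
                 "Content-Type": "application/octet-stream"},
        data=image_bytes,
    )
    analysis = resp.json()
    # Confidences arrive as 0-1 values; scale to match "text 99.9" above.
    for tag in analysis["tags"]:
        print(f"{tag['name']} {tag['confidence'] * 100:.1f}")
    # Caption candidates, e.g. "an old photo of a person 65.4".
    for caption in analysis["description"]["captions"]:
        print(f"{caption['text']} {caption['confidence'] * 100:.1f}")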

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 22-30
Gender Female, 90%
Calm 93.2%
Sad 1.4%
Fear 1.4%
Happy 1.2%
Surprised 1.2%
Disgusted 0.8%
Angry 0.7%
Confused 0.3%

AWS Rekognition

Age 24-34
Gender Female, 97.8%
Calm 89.2%
Happy 4.4%
Sad 2.4%
Confused 2.2%
Surprised 0.6%
Angry 0.5%
Disgusted 0.4%
Fear 0.4%
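
The age range, gender, and ranked emotion scores above are the standard FaceDetails returned by AWS Rekognition's DetectFaces operation when all attributes are requested; two detected faces yield the two blocks. A minimal boto3 sketch (hypothetical file name):

    import boto3

    rekognition = boto3.client("rekognition")

    with open("steinmetz_untitled.jpg", "rb") as f:  # hypothetical file name
        response = rekognition.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # request age, gender, emotions, etc.
        )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
        # Emotions arrive unsorted; sort descending to match the lists above.
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")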

Feature analysis

Amazon

Person 99.6%
Car 62.6%
Bench 58.4%
Sunglasses 57.3%

Categories

Imagga

paintings art 99.2%

Captions

Microsoft
created on 2022-01-23

an old photo of a person 65.4%
old photo of a person 62.6%

Text analysis

Amazon

THE
38755
SCHOOL
WILL
FISHING SCHOOL
WHO WILL BE THE
FISHING
BE
NATIONALLY
WHO
AND
AND ADV
BLACKYS
ADV
RODVR
-
KNOWN
Queenir FISHING
C.VEETA
Queenir
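
The fragments above are raw detections from AWS Rekognition's DetectText operation, which returns both LINE and WORD items; that is why the list mixes phrases ("WHO WILL BE THE") with single tokens ("ADV"), and OCR misreads ("Queenir", "C.VEETA") are reported exactly as detected. A minimal boto3 sketch:

    import boto3

    rekognition = boto3.client("rekognition")

    with open("steinmetz_untitled.jpg", "rb") as f:  # hypothetical file name
        response = rekognition.detect_text(Image={"Bytes": f.read()})

    # Print each detection with its granularity (LINE or WORD).
    for detection in response["TextDetections"]:
        print(detection["Type"], detection["DetectedText"])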

Google

WHO WILL BE THE Queeni FiSHInG BLACKYS FISHING SCHOOL BATIONALLY 38755
WHO
BE
Queeni
BLACKYS
SCHOOL
38755
WILL
THE
FiSHInG
FISHING
BATIONALLY
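
The Google results have the shape produced by Google Cloud Vision text detection: a first annotation holding the full detected string ("WHO WILL BE THE Queeni FiSHInG ..."), followed by the individual tokens. A minimal sketch with the official client library (hypothetical file name; application credentials assumed to be configured):

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("steinmetz_untitled.jpg", "rb") as f:  # hypothetical file name
        image = vision.Image(content=f.read())

    response = client.text_detection(image=image)
    # text_annotations[0] is the whole block; the rest are single tokens,
    # reproduced verbatim, misreads like "BATIONALLY" included.
    for annotation in response.text_annotations:
        print(annotation.description)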