Human Generated Data

Title

Untitled (man fishing off dock with two young boys)

Date

c. 1950

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8834

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 98.7
Human 98.7
Person 98.4
Person 98.2
Person 95.7
Clinic 84.1
Furniture 60.3
Chair 60.3
Hospital 59.1
Doctor 56.2
Operating Theatre 55.3
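
The Amazon tags above are label-detection output of the kind returned by AWS Rekognition's DetectLabels API, each label paired with a 0-100 confidence score. A minimal sketch of how comparable tags could be generated with boto3 follows; the file name and thresholds are illustrative assumptions, not part of the record.

import boto3

def detect_labels(path, max_labels=10, min_confidence=50.0):
    """Return (label, confidence) pairs for a local image, e.g. ('Person', 98.7)."""
    client = boto3.client("rekognition")
    with open(path, "rb") as f:
        image_bytes = f.read()
    response = client.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=max_labels,
        MinConfidence=min_confidence,
    )
    # Each returned label carries a name and a confidence score, as listed above.
    return [(label["Name"], label["Confidence"]) for label in response["Labels"]]

if __name__ == "__main__":
    # Hypothetical file name, used only for illustration.
    for name, conf in detect_labels("steinmetz_4_2002_8834.jpg"):
        print(f"{name} {conf:.1f}")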

Clarifai
created on 2023-10-25

piano 99.8
people 99.7
man 99.2
music 98.7
musician 98.1
monochrome 98.1
adult 97.9
jazz 96.8
instrument 96.3
pianist 96.2
woman 94
group 93.9
indoors 93.6
two 92.3
child 89.4
furniture 88.2
art 85.3
sit 85.2
boy 83.7
rehearsal 83.6

Imagga
created on 2022-01-09

people 30.1
man 29.6
male 26.2
counter 25.2
person 23.7
business 21.3
adult 21.2
happy 18.2
office 18.1
lifestyle 16.6
men 16.3
work 15.8
smiling 15.2
businessman 15
monitor 15
billboard 14.3
women 14.2
sitting 13.7
corporate 12.9
technology 12.6
laptop 12
signboard 11.6
meeting 11.3
equipment 11.3
professional 11.1
casual 11
communication 10.9
businesswoman 10.9
smile 10.7
interior 10.6
dishwasher 10.6
standing 10.4
portrait 10.4
television 9.9
modern 9.8
structure 9.8
cheerful 9.8
job 9.7
computer 9.7
group 9.7
indoors 9.7
musical instrument 9.6
couple 9.6
electronic equipment 9.4
worker 9
chair 9
table 9
looking 8.8
happiness 8.6
two 8.5
percussion instrument 8.4
board 8.3
child 8.3
silhouette 8.3
human 8.2
team 8.1
success 8
working 8
black 7.9
elegant 7.7
pretty 7.7
white goods 7.7
talking 7.6
hand 7.6
career 7.6
room 7.6
togetherness 7.6
holding 7.4
blackboard 7.4
executive 7.3
lady 7.3
indoor 7.3
confident 7.3
home 7.2
classroom 7.2
day 7.1
home appliance 7.1
sky 7

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

text 99.6
black and white 95.1
person 91.1
clothing 77.3
piano 59.8
monochrome 58.1

Face analysis

AWS Rekognition

Age 23-33
Gender Female, 97.9%
Happy 46.3%
Calm 36%
Fear 12.4%
Surprised 1.6%
Sad 1.1%
Confused 1.1%
Disgusted 0.9%
Angry 0.6%

AWS Rekognition

Age 26-36
Gender Male, 62.3%
Sad 50.3%
Fear 27.8%
Happy 10.6%
Angry 4.2%
Calm 3.2%
Surprised 1.5%
Disgusted 1.5%
Confused 1%
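
The age range, gender, and per-emotion percentages above are the standard fields AWS Rekognition returns from its DetectFaces API when all attributes are requested. A minimal sketch, assuming boto3 and a local copy of the image (the file name is hypothetical):

import boto3

def analyze_faces(path):
    client = boto3.client("rekognition")
    with open(path, "rb") as f:
        image_bytes = f.read()
    response = client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])
    faces = []
    for face in response["FaceDetails"]:
        faces.append({
            # Age is reported as a low-high range, e.g. 23-33.
            "age_range": (face["AgeRange"]["Low"], face["AgeRange"]["High"]),
            # Gender comes with its own confidence, e.g. Female, 97.9%.
            "gender": (face["Gender"]["Value"], face["Gender"]["Confidence"]),
            # Emotions are scored individually, as in the listing above.
            "emotions": {e["Type"]: e["Confidence"] for e in face["Emotions"]},
        })
    return faces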

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely
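
The Google Vision rows above (Surprise, Anger, Sorrow, Joy, Headwear, Blurred, each rated on a likelihood scale from Very unlikely to Very likely) correspond to the per-face likelihood fields of the Cloud Vision face detection API. A minimal sketch, assuming a recent google-cloud-vision client library and a local file (the file name is hypothetical):

from google.cloud import vision

def face_likelihoods(path):
    client = vision.ImageAnnotatorClient()
    with open(path, "rb") as f:
        image = vision.Image(content=f.read())
    response = client.face_detection(image=image)
    results = []
    for face in response.face_annotations:
        # Likelihood fields are enum members; .name gives e.g. VERY_UNLIKELY.
        results.append({
            "surprise": face.surprise_likelihood.name,
            "anger": face.anger_likelihood.name,
            "sorrow": face.sorrow_likelihood.name,
            "joy": face.joy_likelihood.name,
            "headwear": face.headwear_likelihood.name,
            "blurred": face.blurred_likelihood.name,
        })
    return results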

Feature analysis

Amazon

Person 98.7%

Text analysis

Amazon

39491B

Google

MJ--YT33A°2--AGON
MJ--YT33A°2--AGON
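
The text readings above, the "39491B" string from Amazon and the garbled edge markings from Google, are OCR output of the kind produced by AWS Rekognition's DetectText and Cloud Vision's text detection. A minimal sketch of the Rekognition side, under the same assumptions as the earlier snippets:

import boto3

def detect_text(path):
    client = boto3.client("rekognition")
    with open(path, "rb") as f:
        image_bytes = f.read()
    response = client.detect_text(Image={"Bytes": image_bytes})
    # LINE-level detections correspond to strings such as "39491B" above.
    return [
        (t["DetectedText"], t["Confidence"])
        for t in response["TextDetections"]
        if t["Type"] == "LINE"
    ]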