Human Generated Data

Title

Untitled (seated couple laughing with man)

Date

1940

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8462

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Person 99.6
Human 99.6
Person 99.5
Clinic 96
Person 94.2
Clothing 93.1
Apparel 93.1
Person 84.6
Sunglasses 81.9
Accessories 81.9
Accessory 81.9
Room 80.5
Indoors 80.5
Interior Design 79.8
Hospital 73.6
Doctor 72.8
People 63.9
Operating Theatre 63.8
Person 62.1
Furniture 61.6
Coat 61
Table 59.5
Chair 57.7

Clarifai
created on 2023-10-26

people 99.8
man 98.9
adult 98.8
group 97.9
indoors 93.9
group together 93.6
sit 93.1
woman 91.5
furniture 90.6
medical practitioner 90.4
chair 90.2
sitting 90
three 89.9
education 86.2
administration 86.1
four 85.5
five 85.5
several 82.6
wear 81.6
two 81.2

Imagga
created on 2022-01-15

person 35.8
people 32.9
man 28.9
male 22.7
life 17.7
adult 17.6
men 17.2
case 16.1
patient 15.7
lifestyle 14.4
hand 13.7
business 13.3
businessman 12.3
happy 11.9
equipment 11.7
teacher 11.5
room 11.1
sport 11
casual 11
board 11
leisure 10.8
professional 10.8
silhouette 10.7
monitor 10.6
sick person 10.5
fun 10.5
world 10.1
model 10.1
laptop 10
clothing 9.9
newspaper 9.8
human 9.7
black 9.6
nurse 9.6
women 9.5
portrait 9
fashion 9
blackboard 9
handsome 8.9
computer 8.9
group 8.9
education 8.6
day 8.6
face 8.5
player 8.3
musician 8.2
work 8.2
disk jockey 8
office 8
product 8
home 8
art 8
love 7.9
boy 7.8
travel 7.7
electronic equipment 7.7
youth 7.7
student 7.5
design 7.3
indoor 7.3
dress 7.2
music 7.2
team 7.2
job 7.1
happiness 7
sky 7

Google
created on 2022-01-15

Microsoft
created on 2022-01-15

text 98.5
clothing 92.6
man 87.3
person 85.3
musical instrument 57.9

Face analysis

Amazon

Google

AWS Rekognition

Age 31-41
Gender Male, 99.9%
Sad 41.6%
Calm 34.7%
Surprised 10.5%
Fear 8.7%
Happy 1.5%
Angry 1.3%
Confused 1.1%
Disgusted 0.6%

AWS Rekognition

Age 34-42
Gender Male, 99.9%
Happy 84.4%
Sad 8.8%
Calm 2.5%
Surprised 1.2%
Confused 1%
Disgusted 0.9%
Fear 0.6%
Angry 0.6%

AWS Rekognition

Age 42-50
Gender Female, 92.9%
Happy 97.5%
Calm 1.5%
Surprised 0.4%
Sad 0.4%
Confused 0.1%
Fear 0.1%
Disgusted 0.1%
Angry 0.1%

AWS Rekognition

Age 37-45
Gender Male, 75.4%
Sad 82%
Calm 17%
Confused 0.4%
Fear 0.3%
Angry 0.1%
Disgusted 0.1%
Surprised 0.1%
Happy 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.6%
Sunglasses 81.9%
Chair 57.7%

Text analysis

Amazon

14509

Google

N509. 14509. 14509. NAGON-Y A MAMTZA3