Human Generated Data

Title

Untitled (two men and woman at party)

Date

1950

People

Artist: Peter James Studio, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.20024

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Clothing 99.6
Apparel 99.6
Person 99.3
Human 99.3
Person 96.6
Person 96.4
Suit 93.6
Coat 93.6
Overcoat 93.6
Sleeve 80.9
Sunglasses 71.1
Accessories 71.1
Accessory 71.1
Finger 69
Shirt 57.8
Long Sleeve 56.2
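
The record does not document how these tags were produced, but label output of this shape can be obtained from AWS Rekognition's DetectLabels API. A minimal sketch, assuming boto3, a hypothetical local copy of the image, and illustrative MaxLabels/MinConfidence settings:

```python
import boto3

rekognition = boto3.client("rekognition")

# Hypothetical local file name; the record does not include an image path.
with open("untitled_party_photo.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,        # illustrative cap on returned labels
    MinConfidence=50.0,  # illustrative confidence floor
)

# Each label pairs a name with a confidence score in percent,
# matching rows such as "Clothing 99.6" above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```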

Clarifai
created on 2023-10-22

people 99.8
monochrome 99.5
man 98.7
portrait 98
adult 97.5
two 95.7
group 95
wedding 94.5
profile 92.1
couple 89.9
three 89.1
black and white 86.4
music 86.1
side view 85.5
woman 85.3
musician 85.2
actor 81.7
scientist 80.6
art 79
science 78.2
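
A comparable sketch for the Clarifai tags, assuming Clarifai's v2 REST prediction endpoint with a placeholder API key, image URL, and general concepts model id (none of these come from the record):

```python
import requests

CLARIFAI_API_KEY = "YOUR_API_KEY"          # placeholder credential
MODEL_ID = "general-image-recognition"     # assumed general concepts model

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {CLARIFAI_API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": "https://example.org/photo.jpg"}}}]},
)
resp.raise_for_status()

# Concepts carry a name and a value in [0, 1]; the table above appears
# to report the same values as percentages.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')
```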

Imagga
created on 2022-03-05

man 45
senior 40.3
male 35.4
person 35.1
people 30.7
portrait 26.5
adult 25.9
elderly 25.8
old 25.1
couple 24.4
happy 23.8
mature 23.2
retired 21.3
businessman 20.3
executive 20.1
business 19.4
handsome 18.7
looking 16.8
together 16.6
professional 16.6
retirement 16.3
face 15.6
suit 15.5
gray 15.3
office 15.3
love 15
glasses 14.8
specialist 14.3
men 13.7
husband 13.5
health 13.2
occupation 12.8
aged 12.7
work 12.5
medical 12.3
wife 12.3
lifestyle 12.3
grandfather 12
computer 12
sitting 12
laptop 12
human 12
hair 11.9
citizen 11.8
older 11.6
smiling 11.6
married 11.5
smile 11.4
grandma 11.3
corporate 11.2
patient 11.1
casual 11
device 10.9
hand 10.6
indoors 10.5
age 10.5
home 10.4
black 10.2
pensioner 10.1
room 10.1
clothing 9.8
modern 9.8
attractive 9.8
family 9.8
worker 9.8
look 9.6
tie 9.5
desk 9.4
happiness 9.4
doctor 9.4
nurse 9.2
indoor 9.1
care 9
outdoors 8.9
lady 8.9
60s 8.8
whistle 8.7
busy 8.7
businesspeople 8.5
manager 8.4
success 8
jacket 8
working 7.9
medicine 7.9
grandmother 7.8
stethoscope 7.8
expertise 7.8
surgeon 7.7
pretty 7.7
expression 7.7
enjoying 7.6
head 7.6
communication 7.6
hospital 7.5
leisure 7.5
holding 7.4
acoustic device 7.4
beard 7.2
romance 7.1
romantic 7.1
women 7.1
job 7.1
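
A sketch for the Imagga tags, assuming Imagga's v2 tagging endpoint with placeholder credentials and image URL:

```python
import requests

# Placeholder credentials and image URL; Imagga's v2 API uses HTTP basic auth.
resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/photo.jpg"},
    auth=("YOUR_API_KEY", "YOUR_API_SECRET"),
)
resp.raise_for_status()

# Each entry pairs a 0-100 confidence with a tag name per language,
# matching rows such as "man 45" above.
for tag in resp.json()["result"]["tags"]:
    print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')
```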

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

person 98.5
human face 97.3
clothing 95.3
text 94.5
man 90.7
black and white 82.4
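
A sketch for the Microsoft tags, assuming the Azure Computer Vision analyze REST endpoint (v3.2) with a placeholder resource endpoint, key, and image URL:

```python
import requests

ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
KEY = "YOUR_SUBSCRIPTION_KEY"                                   # placeholder

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": "https://example.org/photo.jpg"},
)
resp.raise_for_status()

# Tags carry a name and a confidence in [0, 1], shown above as percentages.
for tag in resp.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')
```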

Face analysis

AWS Rekognition

Age 23-31
Gender Male, 98.5%
Calm 99.9%
Angry 0%
Sad 0%
Happy 0%
Surprised 0%
Confused 0%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 29-39
Gender Female, 91.8%
Happy 58.1%
Calm 15.4%
Surprised 8.7%
Confused 6.8%
Sad 5.6%
Disgusted 2.6%
Fear 1.6%
Angry 1.3%

AWS Rekognition

Age 23-31
Gender Female, 91.5%
Calm 95.9%
Sad 2.3%
Surprised 0.9%
Disgusted 0.3%
Happy 0.2%
Fear 0.2%
Angry 0.1%
Confused 0%
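
Per-face attributes of this shape (age range, gender, and an emotion distribution) correspond to AWS Rekognition's DetectFaces with all attributes requested. A minimal sketch, assuming boto3 and a hypothetical local file:

```python
import boto3

rekognition = boto3.client("rekognition")

# Hypothetical local file name; the record does not include an image path.
with open("untitled_party_photo.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # include age range, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age, gender = face["AgeRange"], face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions are returned as a list of type/confidence pairs.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```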

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
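
The likelihood rows above match the face annotation fields returned by Google Cloud Vision. A minimal sketch, assuming the google-cloud-vision client library, credentials configured in the environment, and a hypothetical local file:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

# Hypothetical local file name; the record does not include an image path.
with open("untitled_party_photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Each face reports likelihood enums (e.g. VERY_UNLIKELY, LIKELY) for
# surprise, anger, sorrow, joy, headwear, and blur, as listed above.
for face in response.face_annotations:
    for name, value in [
        ("Surprise", face.surprise_likelihood),
        ("Anger", face.anger_likelihood),
        ("Sorrow", face.sorrow_likelihood),
        ("Joy", face.joy_likelihood),
        ("Headwear", face.headwear_likelihood),
        ("Blurred", face.blurred_likelihood),
    ]:
        print(name, vision.Likelihood(value).name)
```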

Feature analysis

Amazon

Person 99.3%
Person 96.6%
Person 96.4%
Suit 93.6%
Sunglasses 71.1%

Text analysis

Amazon

DO
KODVK
EVERLA
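
The detected fragments look like an OCR reading of printing on or around the photograph (possibly a Kodak edge or back print), and word-level output of this kind matches AWS Rekognition's DetectText. A minimal sketch, assuming boto3 and a hypothetical local file:

```python
import boto3

rekognition = boto3.client("rekognition")

# Hypothetical local file name; the record does not include an image path.
with open("untitled_party_photo.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

# Keep word-level detections; Rekognition also returns LINE aggregates.
for detection in response["TextDetections"]:
    if detection["Type"] == "WORD":
        print(detection["DetectedText"])
```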