Human Generated Data

Title

Untitled (women with coffee service)

Date

c. 1950

People

Artist: John Howell, American, active 1930s-1960s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.21564

Machine Generated Data

Tags (confidence scores, %)

Amazon
created on 2022-03-05

Person 99.8
Human 99.8
Clothing 99.4
Apparel 99.4
Person 99.4
Person 99.2
Female 94.6
Furniture 86
Face 86
Meal 83.1
Food 83.1
Woman 81.3
Table 77.8
Dress 77.4
Dish 77
Pottery 76.4
People 74.7
Vase 72.4
Jar 72.4
Suit 71.9
Coat 71.9
Overcoat 71.9
Girl 71.8
Shoe 71.5
Footwear 71.5
Chair 70.6
Potted Plant 69.9
Plant 69.9
Portrait 69.6
Photography 69.6
Photo 69.6
Flower 59.3
Blossom 59.3
Sitting 58.4
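
These labels and confidence scores are consistent with the output of Amazon Rekognition's DetectLabels API. A minimal sketch of how such tags could be reproduced, assuming boto3 credentials are configured; the filename and region are placeholders, not the museum's actual pipeline:

```python
import boto3

# Minimal sketch: generate image labels with Amazon Rekognition DetectLabels.
rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("untitled_women_with_coffee_service.jpg", "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=50,
        MinConfidence=50,
    )

# Print "Label Confidence" pairs in the same shape as the listing above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```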

Clarifai
created on 2023-10-22

people 99.9
group 99.7
adult 98.7
woman 97.9
two 96.8
several 95.2
group together 94.5
three 93.8
man 93.7
many 93.3
elderly 93.2
five 91.7
leader 90.6
wear 89.8
four 89.6
facial expression 89.6
music 89.1
recreation 89
furniture 88.7
child 87.9

Imagga
created on 2022-03-05

man 37.6
person 35.3
patient 30.3
shower cap 30
cap 28.7
people 28.4
male 26.3
adult 23.7
kin 20.7
headdress 20.3
senior 19.7
sick person 18.5
case 18.4
clothing 18.1
men 18
home 17.5
happy 17.5
salon 16.1
couple 15.7
medical 15
elderly 14.4
portrait 14.2
worker 14.1
sitting 13.7
indoors 13.2
mature 13
lifestyle 13
hospital 12.7
work 12.7
love 12.6
nurse 12.5
health 12.5
working 12.4
care 12.3
smiling 12.3
doctor 12.2
cheerful 12.2
room 12.1
casual 11.9
old 11.8
happiness 11.7
retirement 11.5
smile 11.4
women 11.1
family 10.7
face 10.6
together 10.5
surgeon 10.3
human 9.7
medicine 9.7
husband 9.5
grandma 9.3
professional 9.2
equipment 9.2
hand 9.1
team 9
one 9
kitchen 8.9
uniform 8.8
retired 8.7
illness 8.6
wife 8.5
holding 8.2
20s 8.2
alone 8.2
clinic 7.8
surgery 7.8
mid adult 7.7
expression 7.7
married 7.7
togetherness 7.5
active 7.5
leisure 7.5
mask 7.4
occupation 7.3
grandfather 7.3
covering 7.3
lady 7.3
indoor 7.3
hat 7.3
office 7.2
computer 7.2
looking 7.2
consumer goods 7.2
hair 7.1
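
The Imagga tags above appear to come from Imagga's REST tagging endpoint (/v2/tags). A hedged sketch using plain HTTP; the credentials, the filename, and the response shape noted in the comments are assumptions:

```python
import requests

# Hedged sketch: request tags for a local image from Imagga's /v2/tags endpoint.
# API key, secret, and filename are placeholders.
API_KEY = "your_api_key"
API_SECRET = "your_api_secret"

with open("untitled_women_with_coffee_service.jpg", "rb") as f:
    response = requests.post(
        "https://api.imagga.com/v2/tags",
        auth=(API_KEY, API_SECRET),
        files={"image": f},
    )

# Assumed response shape:
# {"result": {"tags": [{"confidence": 37.6, "tag": {"en": "man"}}, ...]}}
for entry in response.json()["result"]["tags"]:
    print(f"{entry['tag']['en']} {entry['confidence']:.1f}")
```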

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

text 98.9
person 98.7
clothing 94.7
man 92.5
black and white 80.3
human face 65.5

Face analysis

AWS Rekognition

Age 34-42
Gender Male, 86.1%
Happy 96.9%
Surprised 0.9%
Fear 0.8%
Sad 0.6%
Calm 0.3%
Disgusted 0.2%
Angry 0.1%
Confused 0.1%

AWS Rekognition

Age 47-53
Gender Male, 99.7%
Happy 96.1%
Sad 1%
Surprised 0.7%
Confused 0.6%
Calm 0.6%
Disgusted 0.5%
Angry 0.3%
Fear 0.2%

AWS Rekognition

Age 33-41
Gender Male, 99.9%
Happy 30%
Calm 27.9%
Surprised 18.4%
Confused 15.4%
Disgusted 6.8%
Sad 0.7%
Angry 0.5%
Fear 0.3%
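
The age ranges, gender estimates, and emotion scores above match the structure of Amazon Rekognition's DetectFaces response. A minimal sketch, again assuming boto3 credentials and a placeholder filename:

```python
import boto3

# Minimal sketch: per-face age, gender, and emotion estimates via Rekognition
# DetectFaces. Attributes=["ALL"] is required to get the emotion scores.
rekognition = boto3.client("rekognition")

with open("untitled_women_with_coffee_service.jpg", "rb") as f:
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions arrive unsorted; sort by confidence to mirror the listing above.
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```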

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
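
The likelihood ratings above (surprise, anger, sorrow, joy, headwear, blur) follow the shape of Google Cloud Vision face detection results. A sketch assuming the google-cloud-vision client library and configured credentials; the filename is a placeholder:

```python
from google.cloud import vision

# Sketch: per-face likelihood ratings via Google Cloud Vision face detection.
client = vision.ImageAnnotatorClient()

with open("untitled_women_with_coffee_service.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Likelihood enum values map onto the labels used in the listing above.
likelihood = ("Unknown", "Very unlikely", "Unlikely",
              "Possible", "Likely", "Very likely")

for face in response.face_annotations:
    print(f"Surprise {likelihood[face.surprise_likelihood]}")
    print(f"Anger {likelihood[face.anger_likelihood]}")
    print(f"Sorrow {likelihood[face.sorrow_likelihood]}")
    print(f"Joy {likelihood[face.joy_likelihood]}")
    print(f"Headwear {likelihood[face.headwear_likelihood]}")
    print(f"Blurred {likelihood[face.blurred_likelihood]}")
```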

Feature analysis

Amazon

Person
Person 99.8%
Person 99.4%
Person 99.2%

Text analysis

Amazon

M
113
٢ M 113 022HA
٢
022HA
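
The fragments above are consistent with output from Amazon Rekognition's DetectText API, which reports each detected LINE along with its component WORD detections. A minimal sketch, assuming the same boto3 setup and placeholder filename as above:

```python
import boto3

# Minimal sketch: extract text fragments with Rekognition DetectText.
rekognition = boto3.client("rekognition")

with open("untitled_women_with_coffee_service.jpg", "rb") as f:
    response = rekognition.detect_text(Image={"Bytes": f.read()})

# DetectText reports both LINE and WORD entries, which is why the listing
# above shows a full line alongside its individual words.
for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"])
```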

Google

SMI3 YT3A2 002KA
SMI3
YT3A2
002KA