Human Generated Data

Title

Untitled (young man and woman at dance)

Date

1965

People

Artist: Samuel Cooper, American, active 1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19614

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Clothing 99.9
Apparel 99.9
Person 99.2
Human 99.2
Person 99
Furniture 98.5
Person 97.6
Chair 92.2
Suit 90.8
Overcoat 90.8
Coat 90.8
Robe 90.8
Fashion 90.8
Gown 87
Dress 85
Wedding 81.6
Sleeve 80.8
Bridegroom 77.8
Female 77.3
Indoors 75.1
Sunglasses 74.9
Accessories 74.9
Accessory 74.9
Wedding Gown 74.7
Person 72.3
Home Decor 71.7
Face 69.6
Portrait 67.1
Photography 67.1
Photo 67.1
Woman 62.9
Man 62.4
Floor 60.8
Person 60.4
Flooring 58.1
Bride 58
Long Sleeve 57.3
Shoe 56.9
Footwear 56.9
Room 55.1
Shoe 54.9

Clarifai
created on 2023-10-22

people 99.6
group 99
man 97.7
group together 97.1
adult 96.2
woman 95.4
wear 90.2
three 84.4
leader 83.3
actor 81.7
four 80.9
five 78.7
many 78
handshake 77.4
two 76.6
several 75.4
family 74.1
dancing 71.4
wedding 70
chair 67.9

Imagga
created on 2022-03-05

crutch 42.5
staff 34.8
people 34.6
person 28.3
stick 28.1
man 27.5
nurse 26.8
adult 19.9
male 19.8
men 17.2
women 15
active 14.7
outdoors 14.2
snow 13.8
business 13.4
walking 13.3
winter 12.8
human 12.7
businessman 12.4
fashion 12.1
brass 12
couple 11.3
looking 11.2
professional 11.1
beach 11
patient 10.9
sport 10.9
clothing 10.7
happy 10.6
travel 10.6
group 10.5
health 10.4
senior 10.3
lifestyle 10.1
suit 10
mountain 9.8
portrait 9.7
success 9.6
together 9.6
sand 9.6
standing 9.6
cold 9.5
wall 9.4
outdoor 9.2
leisure 9.1
old 9.1
dress 9
grandfather 9
vacation 9
fun 9
activity 9
teacher 8.7
wind instrument 8.7
corporate 8.6
black 8.4
life 8.3
exercise 8.2
child 8
smiling 8
indoors 7.9
holiday 7.9
happiness 7.8
clothes 7.5
room 7.3
team 7.2
love 7.1
summer 7.1
work 7.1
medical 7.1
day 7.1

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

person 96.7
outdoor 92.4
text 89
clothing 78.5
black and white 74.1
man 52

Color Analysis

Face analysis

AWS Rekognition

Age 36-44
Gender Female, 96.5%
Calm 42%
Surprised 24.1%
Happy 17.5%
Confused 5%
Disgusted 4.8%
Sad 4.3%
Fear 1.2%
Angry 1.1%

AWS Rekognition

Age 43-51
Gender Female, 99.4%
Happy 71.8%
Angry 16.1%
Sad 3.3%
Surprised 2.7%
Calm 2.1%
Fear 1.9%
Confused 1.1%
Disgusted 1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.2%
Person 99%
Person 97.6%
Person 72.3%
Person 60.4%
Chair 92.2%
Sunglasses 74.9%
Shoe 56.9%
Shoe 54.9%

Text analysis

Amazon

2
3 2
3
23
ЧАС 3
23 YT3RA®2
y
ЧАС
YT3RA®2

Google

2 3 19 3 MJ YT A°2NAC
2
3
19
MJ
YT
A°2NAC