Human Generated Data

Title

Untitled (woman and children in family room)

Date

c. 1950

People

Artist: John Howell, American, active 1930s-1960s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.21547

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Shorts 99.8
Clothing 99.8
Apparel 99.8
Person 99.7
Human 99.7
Person 99.6
Person 99.3
Person 99
Person 98.9
Person 97.6
Person 97.5
Helmet 95.4
Person 91.7
Chair 86.9
Furniture 86.9
People 86.8
Kid 77.5
Child 77.5
Face 76.8
Indoors 76.8
Female 74
Suit 66.8
Coat 66.8
Overcoat 66.8
Sport 66.6
Sports 66.6
Girl 65.2
Boy 64.1
Play 62.9
Photography 62.2
Photo 62.2
Shoe 61.5
Footwear 61.5
Couch 59.7
Smile 58
Room 57.1
Bed 56.3
Fitness 55
Working Out 55
Exercise 55
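
The Amazon values above are label names paired with confidence scores on a 0-100 scale, as returned by AWS Rekognition label detection. A minimal sketch of that call via boto3 follows; the file name and thresholds are illustrative assumptions, not part of the catalog record.

    import boto3

    rekognition = boto3.client("rekognition")

    # Hypothetical local copy of the photograph; not part of the record.
    with open("untitled_woman_and_children.jpg", "rb") as f:
        response = rekognition.detect_labels(
            Image={"Bytes": f.read()},
            MaxLabels=50,       # assumed cap on returned labels
            MinConfidence=55,   # assumed floor; the lowest score above is 55
        )

    # Each label carries a name and a 0-100 confidence, matching the
    # "Shorts 99.8", "Clothing 99.8", ... pairs listed above.
    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')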

Clarifai
created on 2023-10-22

people 99.9
child 99.9
group 99.8
group together 98.8
boy 97.1
adult 97
man 96.6
education 96.1
woman 95.9
many 95.8
son 94.3
sibling 93.7
school 93.7
family 92.5
several 90.8
teacher 89.2
recreation 88.7
adolescent 88
outfit 87.2
portrait 86.8

Imagga
created on 2022-03-05

kin 42.3
sport 29
sibling 28.4
man 28.2
person 21.3
athlete 20.9
people 20.6
male 20.6
world 19.9
ball 18.5
adult 17.8
lifestyle 17.3
exercise 17.2
active 17.2
player 17
competition 16.5
black 16.2
court 14.6
tennis 13.6
fun 12.7
fitness 12.6
child 12.5
outside 12
teacher 11.4
happy 11.3
outdoors 11.2
action 11.1
training 11.1
youth 11.1
silhouette 10.8
recreation 10.7
backboard 10.7
run 10.6
group 10.5
equipment 10.5
body 10.4
play 10.3
men 10.3
summer 10.3
motion 10.3
casual 10.2
leisure 10
outdoor 9.9
team 9.8
attractive 9.8
style 9.6
couple 9.6
boy 9.6
legs 9.4
modern 9.1
portrait 8.4
art 8.4
racket 8.4
lady 8.1
love 7.9
happiness 7.8
clothing 7.8
sitting 7.7
jump 7.7
healthy 7.6
fashion 7.5
one 7.5
event 7.4
freedom 7.3
playing 7.3
business 7.3
smiling 7.2
smile 7.1
planner 7.1
attire 7

Google
created on 2022-03-05

Shorts 90.6
Black-and-white 85
Style 84.1
T-shirt 78.8
Art 78.6
Font 77.2
Music 76.1
Monochrome 75.1
Monochrome photography 74.9
Lamp 73.9
Event 72.5
Fun 72
Crew 68.3
Musical instrument 67.9
Team 67.3
Room 65.9
Sitting 60.7
Recreation 58.7
Musician 57.8
Entertainment 56.4
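
The Google tags are consistent with Cloud Vision label detection, which reports scores on a 0-1 scale (shown above multiplied by 100). A sketch under that assumption using the google-cloud-vision client; the file name is illustrative.

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    # Hypothetical local copy of the photograph.
    with open("untitled_woman_and_children.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.label_detection(image=image)

    # label.score is 0-1; scaling by 100 gives values in the form shown
    # above ("Shorts 90.6", "Black-and-white 85", ...).
    for label in response.label_annotations:
        print(f"{label.description} {label.score * 100:.1f}")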

Microsoft
created on 2022-03-05

person 96.4
text 93
clothing 85.5
posing 59.8

Face analysis

AWS Rekognition

Age 35-43
Gender Male, 99.6%
Calm 47%
Surprised 24.8%
Fear 12.8%
Happy 7.5%
Angry 3.2%
Sad 2.5%
Disgusted 1.5%
Confused 0.8%

AWS Rekognition

Age 23-31
Gender Male, 96.1%
Sad 72.1%
Happy 13.6%
Calm 10.7%
Surprised 1.9%
Confused 0.5%
Fear 0.5%
Angry 0.4%
Disgusted 0.3%

AWS Rekognition

Age 28-38
Gender Male, 99.5%
Calm 96.5%
Surprised 0.8%
Happy 0.8%
Confused 0.5%
Fear 0.5%
Disgusted 0.4%
Sad 0.4%
Angry 0.2%

AWS Rekognition

Age 28-38
Gender Male, 98.4%
Calm 33.1%
Happy 27.5%
Surprised 20.5%
Sad 12.6%
Fear 2.4%
Disgusted 1.7%
Angry 1.2%
Confused 1%

AWS Rekognition

Age 31-41
Gender Female, 53.9%
Happy 99.5%
Calm 0.2%
Surprised 0.2%
Disgusted 0%
Fear 0%
Angry 0%
Sad 0%
Confused 0%

AWS Rekognition

Age 35-43
Gender Male, 99%
Happy 68.7%
Calm 25.5%
Surprised 3.5%
Confused 0.7%
Sad 0.6%
Disgusted 0.4%
Angry 0.3%
Fear 0.2%

AWS Rekognition

Age 33-41
Gender Male, 99.4%
Surprised 52.8%
Happy 46.4%
Disgusted 0.2%
Calm 0.1%
Angry 0.1%
Fear 0.1%
Sad 0.1%
Confused 0.1%
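
Each AWS Rekognition block above describes one detected face: an estimated age range, a gender guess with its confidence, and an emotion distribution. A sketch of the detect_faces call that yields this structure, with the file name assumed.

    import boto3

    rekognition = boto3.client("rekognition")

    with open("untitled_woman_and_children.jpg", "rb") as f:  # assumed file name
        response = rekognition.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # request age range, gender, and emotions
        )

    # One FaceDetails entry per detected face, mirroring the blocks above.
    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')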

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
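
The Google Vision face blocks report bucketed likelihoods (Very unlikely through Very likely) rather than numeric scores. A sketch of reading those enum values with the google-cloud-vision client, again assuming a local copy of the image.

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("untitled_woman_and_children.jpg", "rb") as f:  # assumed file name
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    # Each attribute is a Likelihood enum, which is how rows such as
    # "Surprise Very unlikely" above are expressed.
    for face in response.face_annotations:
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)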

Feature analysis

Amazon

Person 99.7%
Person 99.6%
Person 99.3%
Person 99%
Person 98.9%
Person 97.6%
Person 97.5%
Person 91.7%
Helmet 95.4%
Shoe 61.5%
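
The per-object entries above (eight Person detections, one Helmet, one Shoe) correspond to the Instances that Rekognition attaches to object labels, each with its own bounding box and confidence. A sketch of reading them from the same detect_labels response used for the tags; response here is assumed to be the object from the earlier label-detection sketch.

    # Object labels such as Person, Helmet, and Shoe carry per-instance
    # detections, which is what the percentages listed above reflect.
    for label in response["Labels"]:
        for instance in label.get("Instances", []):
            box = instance["BoundingBox"]
            print(f'{label["Name"]} {instance["Confidence"]:.1f}% '
                  f'(left={box["Left"]:.2f}, top={box["Top"]:.2f})')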

Text analysis

Amazon

so
KODAK-ТA
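
The two strings above are OCR output, plausibly a handwritten notation and the Kodak edge print on the negative. A sketch of the Rekognition detect_text call that produces this kind of result; the file name is assumed.

    import boto3

    rekognition = boto3.client("rekognition")

    with open("untitled_woman_and_children.jpg", "rb") as f:  # assumed file name
        response = rekognition.detect_text(Image={"Bytes": f.read()})

    # LINE-level detections correspond to the strings listed above.
    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":
            print(detection["DetectedText"])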