Human Generated Data

Title

Untitled (group portrait of women's club)

Date

1974

People

Artist: Ken Whitmire Associates, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19794

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 98.9
Human 98.9
Person 98.8
Furniture 98.6
Person 97.8
Person 97.8
Person 97.6
Person 97.4
Person 97.3
Person 97
Person 96.4
Person 95.4
Person 95
Person 94.3
Person 91.9
Person 88.1
Indoors 88
Room 87.7
Person 80.6
Classroom 79.7
School 79.7
Crowd 74.8
Table 72.5
Clothing 72.2
Apparel 72.2
Shoe 71
Footwear 71
Interior Design 70.1
Flag 67.2
Symbol 67.2
Stage 66.9
People 66.2
Floor 62.3
Photography 60.6
Photo 60.6
Portrait 59.6
Face 59.6
Wood 58.3
Audience 57.3
Chair 55.6
Shoe 55
Shoe 52.9

Clarifai
created on 2023-10-22

people 99.8
group 99.6
group together 99.3
many 98
woman 97
child 95.8
man 95.8
education 95
adult 94.8
music 91.7
several 90.6
school 89.8
five 86.9
class 84.6
family 83.9
leader 82.2
teacher 81.5
adolescent 80.6
recreation 77.6
wear 76.7

Imagga
created on 2022-03-05

person 41
people 35.1
man 26.9
golfer 26.1
adult 25.2
player 21.1
women 20.5
group 20.1
happy 20
male 18.4
contestant 17.3
lifestyle 16.6
silhouette 16.5
dancer 16.3
men 16.3
sport 15.9
kin 15.7
couple 15.7
portrait 15.5
teacher 15
businessman 14.1
happiness 14.1
together 14
smiling 13.7
body 13.6
professional 13.1
fashion 12.8
fun 12.7
outdoors 12.7
performer 12.7
pretty 12.6
attractive 12.6
team 12.5
model 12.4
business 11.5
active 11.5
athlete 11.4
sexy 11.2
sitting 11.2
classroom 11.1
child 11.1
sunset 10.8
smile 10.7
crowd 10.6
lady 10.5
human 10.5
success 10.5
style 10.4
casual 10.2
training 10.2
beach 10.1
water 10
exercise 10
educator 9.9
fitness 9.9
gymnasium 9.8
boy 9.6
clothing 9.5
work 9.4
runner 9.3
dark 9.2
sibling 9
family 8.9
room 8.7
run 8.7
love 8.7
holiday 8.6
walking 8.5
passion 8.5
friends 8.4
friendship 8.4
student 8.4
summer 8.4
competition 8.2
sensuality 8.2
dress 8.1
entertainer 8.1
sun 8
dance 7.9
black 7.9
athletic facility 7.8
running 7.7
sky 7.6
health 7.6
two 7.6
leisure 7.5
one 7.5
sensual 7.3
posing 7.1

Google
created on 2022-03-05

Style 83.8
Font 80.1
Adaptation 79.3
Art 73.8
Event 72
Monochrome 72
Monochrome photography 70.1
Vintage clothing 69.1
Room 69.1
Chair 68.5
Visual arts 64.4
Sitting 62.4
Photo caption 62.3
Illustration 60.5
Child 58.1
Team 56.8
Class 55.2
Painting 54.9
Fun 51.5
Family 51.3

Microsoft
created on 2022-03-05

person 97.8
clothing 97.7
outdoor 90.8
footwear 89.7
woman 84.9
court 83.1
text 81.9
man 77.9
posing 39.9

Color Analysis

Face analysis

AWS Rekognition

Age 51-59
Gender Female, 61.1%
Sad 98.2%
Fear 0.6%
Calm 0.3%
Confused 0.2%
Angry 0.2%
Happy 0.2%
Disgusted 0.2%
Surprised 0.1%

AWS Rekognition

Age 45-51
Gender Female, 56.3%
Calm 99.1%
Happy 0.3%
Sad 0.3%
Disgusted 0.1%
Angry 0.1%
Confused 0%
Fear 0%
Surprised 0%

AWS Rekognition

Age 48-54
Gender Female, 68.2%
Calm 97.2%
Sad 1.5%
Confused 0.5%
Surprised 0.4%
Disgusted 0.1%
Happy 0.1%
Fear 0.1%
Angry 0.1%

AWS Rekognition

Age 52-60
Gender Female, 90.6%
Disgusted 51.3%
Happy 24.6%
Calm 7.7%
Surprised 5.8%
Fear 3.6%
Angry 2.9%
Confused 2.4%
Sad 1.7%

AWS Rekognition

Age 42-50
Gender Female, 62.4%
Calm 96.8%
Happy 3%
Sad 0.1%
Surprised 0%
Confused 0%
Disgusted 0%
Angry 0%
Fear 0%

AWS Rekognition

Age 45-51
Gender Male, 99.6%
Calm 80.2%
Disgusted 7.6%
Surprised 5.5%
Confused 3.1%
Sad 1.2%
Fear 1%
Happy 0.7%
Angry 0.7%

AWS Rekognition

Age 49-57
Gender Male, 94%
Calm 88.5%
Happy 8.4%
Sad 1.1%
Confused 0.6%
Disgusted 0.5%
Angry 0.4%
Fear 0.3%
Surprised 0.2%

AWS Rekognition

Age 45-53
Gender Male, 99.9%
Calm 61.4%
Happy 28%
Sad 5.6%
Confused 1.3%
Disgusted 1.3%
Angry 1.3%
Surprised 0.8%
Fear 0.4%

AWS Rekognition

Age 35-43
Gender Male, 87.4%
Calm 67%
Sad 10.6%
Happy 6.8%
Confused 6.3%
Disgusted 4.9%
Angry 2.1%
Surprised 1.5%
Fear 0.9%

AWS Rekognition

Age 45-53
Gender Male, 87.2%
Calm 80.7%
Happy 12%
Sad 3.1%
Confused 1.4%
Fear 1%
Disgusted 0.7%
Angry 0.7%
Surprised 0.5%

AWS Rekognition

Age 34-42
Gender Female, 99.4%
Happy 87.8%
Calm 5.3%
Sad 5.1%
Confused 0.6%
Disgusted 0.3%
Surprised 0.3%
Angry 0.3%
Fear 0.3%

AWS Rekognition

Age 41-49
Gender Male, 90.7%
Sad 80.5%
Calm 8.2%
Happy 3.7%
Fear 2.1%
Angry 1.8%
Confused 1.7%
Surprised 1.1%
Disgusted 1%

AWS Rekognition

Age 27-37
Gender Female, 99.6%
Calm 96.1%
Sad 2.2%
Fear 0.5%
Confused 0.4%
Angry 0.2%
Surprised 0.2%
Happy 0.2%
Disgusted 0.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.9%
Person 98.8%
Person 97.8%
Person 97.8%
Person 97.6%
Person 97.4%
Person 97.3%
Person 97%
Person 96.4%
Person 95.4%
Person 95%
Person 94.3%
Person 91.9%
Person 88.1%
Person 80.6%
Shoe 71%
Shoe 55%
Shoe 52.9%
Chair 55.6%

Text analysis

Amazon

9
A
765
R
8-3
040
2