Human Generated Data

Title

Untitled (guests at party, sitting)

Date

1965

People

Artist: Robert Burian, American, active 1940s-1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19333
Machine Generated Data

Tags

Amazon
created on 2022-02-25

Person 99
Human 99
Person 98.1
Shoe 97.8
Clothing 97.8
Footwear 97.8
Apparel 97.8
Chair 97.2
Furniture 97.2
Sitting 96.4
Person 96.4
Person 95.6
Person 89.9
Person 88.7
Suit 73
Coat 73
Overcoat 73
Suit 68.7
Suit 68.1
Flooring 67
Leisure Activities 61.6
Floor 60.6
Pants 58.4

Clarifai
created on 2023-10-22

people 99.8
man 98.7
chair 98
adult 97.9
woman 97.7
group 96.7
two 96.6
furniture 95.9
indoors 95.9
room 94.1
seat 90.6
sit 90.5
music 88.8
group together 88.1
actor 87.9
leader 87.4
three 87.2
portrait 87.1
family 85.3
sitting 84.4

Imagga
created on 2022-02-25

oboe 35.8
business 35.2
people 35.1
man 34.9
person 30.3
businesswoman 30
male 29.8
computer 28.3
work 28.2
businessman 28.2
adult 27.9
laptop 27.8
group 27.4
office 27
professional 25.1
men 23.2
wind instrument 23.1
women 22.9
meeting 22.6
corporate 22.3
musical instrument 22
teacher 22
team 21.5
communication 20.1
chair 20.1
executive 19.8
sitting 19.7
teamwork 19.5
working 18.6
table 18.2
success 17.7
happy 17.5
smiling 17.4
classroom 16.8
smile 16.4
woodwind 16.3
businesspeople 16.1
job 15.9
room 15.9
together 15.8
technology 14.8
suit 14.4
student 14.3
outfit 14
attractive 14
education 13.8
lifestyle 13.7
two 13.5
worker 13.4
couple 13.1
looking 12.8
notebook 12.4
indoors 12.3
manager 12.1
modern 11.9
casual 11.9
discussion 11.7
handsome 11.6
talking 11.4
cheerful 11.4
educator 11.2
occupation 11
confident 10.9
desk 10.4
black 10.2
successful 10.1
indoor 10
pretty 9.8
boss 9.6
study 9.3
girls 9.1
accordion 9.1
portrait 9.1
conference 8.8
students 8.8
colleagues 8.7
employee 8.7
busy 8.7
friends 8.5
learning 8.5
planner 8.4
coffee 8.3
brass 8.3
holding 8.3
20s 8.2
keyboard instrument 8.2
board 8.1
copy space 8.1
chatting 7.8
happiness 7.8
full length 7.8
corporation 7.7
class 7.7
diversity 7.7
city 7.5
presentation 7.4

Google
created on 2022-02-25

Microsoft
created on 2022-02-25

person 97.4
woman 93.3
text 91.9
clothing 83.9
smile 78
man 66.6
posing 60.7
footwear 60
piano 53.1
female 26.2

Color Analysis

Face analysis
AWS Rekognition

Age 16-24
Gender Male, 57.1%
Calm 98.2%
Confused 0.5%
Surprised 0.3%
Fear 0.3%
Disgusted 0.3%
Angry 0.3%
Sad 0.2%
Happy 0%

AWS Rekognition

Age 18-26
Gender Male, 99.9%
Calm 69.3%
Angry 17.8%
Disgusted 6%
Confused 3.3%
Sad 1.2%
Fear 0.9%
Surprised 0.8%
Happy 0.7%

AWS Rekognition

Age 21-29
Gender Female, 100%
Happy 66.6%
Calm 27.3%
Confused 2.3%
Angry 1.3%
Surprised 1.2%
Sad 0.7%
Fear 0.3%
Disgusted 0.3%

AWS Rekognition

Age 51-59
Gender Male, 98.7%
Calm 96.8%
Angry 1.9%
Confused 0.7%
Surprised 0.2%
Disgusted 0.2%
Sad 0.1%
Fear 0%
Happy 0%

AWS Rekognition

Age 12-20
Gender Female, 85%
Happy 58.2%
Surprised 24.1%
Confused 7.6%
Calm 3.9%
Fear 2.7%
Angry 1.7%
Sad 1.1%
Disgusted 0.8%

AWS Rekognition

Age 7-17
Gender Female, 99.9%
Calm 91.1%
Surprised 3%
Sad 2.6%
Confused 0.9%
Fear 0.9%
Disgusted 0.7%
Happy 0.5%
Angry 0.3%

AWS Rekognition

Age 23-31
Gender Female, 100%
Surprised 69.7%
Fear 12.1%
Confused 6.4%
Calm 4.9%
Happy 3.5%
Angry 1.7%
Sad 1%
Disgusted 0.8%

Microsoft Cognitive Services

Age 23
Gender Female

Microsoft Cognitive Services

Age 4
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Feature analysis

Amazon
Person 99%
Person 98.1%
Person 96.4%
Person 95.6%
Person 89.9%
Person 88.7%
Shoe 97.8%
Chair 97.2%
Suit 73%
Suit 68.7%
Suit 68.1%

Text analysis

Amazon

65
JAN

Google

JAN 65
JAN
65