Human Generated Data

Title

Untitled (young men and women at tables at ball)

Date

1962

People

Artist: Robert Burian, American, active 1940s-1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19176

Machine Generated Data

Tags

Amazon
created on 2022-02-25

Person 99.4
Human 99.4
Person 98.7
Person 98.4
Person 97.9
Person 95.4
Person 93.8
Restaurant 93.4
Person 92.6
Person 85.8
Person 85
Person 84.7
Meal 82.9
Food 82.9
Person 82.1
Suit 80.2
Coat 80.2
Overcoat 80.2
Clothing 80.2
Apparel 80.2
Person 78.4
Crowd 77.1
People 74.4
Food Court 64.8
Table 64.7
Furniture 64.7
Pub 63.8
Poster 63.5
Advertisement 63.5
Room 62.4
Indoors 62.4
Text 61.6
Bar Counter 61
Dining Table 59.4
Cafeteria 58.1
Portrait 55.9
Photography 55.9
Face 55.9
Photo 55.9

Clarifai
created on 2023-10-22

people 99.9
group 99.6
many 98.8
adult 97.8
group together 97.3
man 96.9
woman 96.8
administration 94.9
leader 94.5
several 91.1
chair 88.1
wear 86.4
furniture 85.9
audience 83.9
recreation 82.2
military 81.2
actor 80.4
war 80.3
music 79.4
sit 79.3

Imagga
created on 2022-02-25

man 29.7
business 25.5
people 24.5
person 24.4
male 24.2
office 21.2
businessman 21.2
group 20.1
professional 19.9
team 19.7
classroom 18
teacher 16.5
world 16.2
adult 16.1
work 15.7
happy 15
corporate 14.6
student 14.1
meeting 14.1
success 13.7
portrait 13.6
silhouette 13.2
education 13
job 12.4
teamwork 12
indoor 11.9
businesswoman 11.8
smiling 11.6
newspaper 11.4
couple 11.3
suit 10.8
smile 10.7
financial 10.7
desk 10.5
child 10.4
black 10.3
school 10.3
men 10.3
manager 10.2
communication 10.1
creation 9.7
working 9.7
colleagues 9.7
boy 9.6
envelope 9.5
businesspeople 9.5
executive 9.4
finance 9.3
groom 9.3
glasses 9.3
face 9.2
product 9.2
room 9.2
paper 8.8
partner 8.7
happiness 8.6
college 8.5
money 8.5
entrepreneur 8.4
presentation 8.4
sport 8.3
technology 8.2
worker 8.1
women 7.9
together 7.9
table 7.9
class 7.7
modern 7.7
card 7.6
tie 7.6
vintage 7.5
fun 7.5
letter 7.3
occupation 7.3
aged 7.2
blackboard 7.1
indoors 7

Google
created on 2022-02-25

Microsoft
created on 2022-02-25

text 99.8
person 96.8
clothing 94.1
man 92.6
black 78.6
human face 73.6
posing 71.2

Color Analysis

Face analysis

Amazon

Microsoft

Google

AWS Rekognition

Age 19-27
Gender Male, 86.6%
Calm 99.7%
Confused 0.1%
Surprised 0.1%
Sad 0.1%
Angry 0%
Happy 0%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 30-40
Gender Male, 99.9%
Calm 67.5%
Angry 13.3%
Sad 10.9%
Confused 2.8%
Disgusted 2.2%
Fear 2.1%
Surprised 0.9%
Happy 0.3%

AWS Rekognition

Age 31-41
Gender Female, 99.9%
Calm 66.3%
Confused 17%
Angry 10.6%
Surprised 2.7%
Sad 1.2%
Disgusted 0.8%
Fear 0.7%
Happy 0.7%

AWS Rekognition

Age 23-33
Gender Female, 99.9%
Surprised 36.4%
Happy 21.3%
Angry 15.3%
Disgusted 9.3%
Calm 8.2%
Fear 3.8%
Sad 3.2%
Confused 2.5%

AWS Rekognition

Age 22-30
Gender Male, 98.2%
Confused 41.3%
Angry 27.9%
Calm 22.4%
Sad 6.6%
Fear 0.7%
Surprised 0.6%
Disgusted 0.4%
Happy 0.1%

AWS Rekognition

Age 19-27
Gender Male, 98.1%
Sad 48.4%
Calm 31.5%
Confused 9.2%
Surprised 6.4%
Disgusted 1.5%
Angry 1.3%
Fear 1%
Happy 0.9%

AWS Rekognition

Age 6-14
Gender Female, 100%
Surprised 64.4%
Calm 14.3%
Happy 9.5%
Angry 3.1%
Confused 3.1%
Fear 3%
Sad 1.5%
Disgusted 1.1%

AWS Rekognition

Age 21-29
Gender Male, 86.6%
Angry 39%
Calm 22.6%
Disgusted 16.1%
Happy 8.9%
Surprised 8.1%
Confused 2.2%
Sad 1.6%
Fear 1.5%

AWS Rekognition

Age 26-36
Gender Male, 98.8%
Happy 91.3%
Calm 8%
Disgusted 0.4%
Surprised 0.1%
Sad 0.1%
Fear 0.1%
Angry 0%
Confused 0%

AWS Rekognition

Age 20-28
Gender Male, 99.8%
Happy 81.3%
Calm 8.3%
Disgusted 3.6%
Angry 3.5%
Surprised 1.5%
Sad 0.6%
Confused 0.6%
Fear 0.4%

AWS Rekognition

Age 28-38
Gender Male, 89.1%
Happy 91.3%
Calm 5.7%
Surprised 1.1%
Disgusted 0.6%
Sad 0.4%
Angry 0.4%
Confused 0.3%
Fear 0.1%

AWS Rekognition

Age 21-29
Gender Male, 99.8%
Sad 99.1%
Disgusted 0.6%
Calm 0.1%
Confused 0.1%
Fear 0.1%
Surprised 0%
Angry 0%
Happy 0%

AWS Rekognition

Age 23-31
Gender Male, 100%
Sad 82.7%
Calm 10%
Angry 2.7%
Surprised 1.5%
Confused 1.2%
Fear 0.9%
Disgusted 0.8%
Happy 0.2%

AWS Rekognition

Age 22-30
Gender Female, 50.7%
Sad 99.1%
Calm 0.4%
Confused 0.1%
Surprised 0.1%
Fear 0.1%
Angry 0.1%
Disgusted 0%
Happy 0%

Microsoft Cognitive Services

Age 23
Gender Male

Microsoft Cognitive Services

Age 31
Gender Female

Microsoft Cognitive Services

Age 20
Gender Female

Microsoft Cognitive Services

Age 39
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Feature analysis

Amazon

Person
Poster
Person 99.4%
Person 98.7%
Person 98.4%
Person 97.9%
Person 95.4%
Person 93.8%
Person 92.6%
Person 85.8%
Person 85%
Person 84.7%
Person 82.1%
Person 78.4%
Poster 63.5%

Categories

Text analysis

Amazon

62
DEC
129
129 +24
+24

Google

DEC 62 129 124
DEC
62
129
124