Human Generated Data

Title

Untitled (people at table at party)

Date

c. 1966

People

Artist: Robert Burian, American, active 1940s-1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19245

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Chair 99.9
Furniture 99.9
Chair 99.5
Person 99.2
Human 99.2
Person 98.1
Person 97.6
Person 97.6
Person 97.3
Sitting 94.9
Chair 93.9
Room 93.7
Indoors 93.7
Person 93.7
Table 82.6
Dining Table 75.5
Tie 68.8
Accessories 68.8
Accessory 68.8
Crowd 65.1
People 63.7
Meal 62.9
Food 62.9
Suit 58.9
Clothing 58.9
Coat 58.9
Overcoat 58.9
Apparel 58.9
Dish 58.3
Restaurant 58.2
Senior Citizen 56.7
Home Decor 56.1
Dining Room 55.6

Clarifai
created on 2023-10-22

people 99.7
man 97.9
group 97.9
woman 97.4
adult 97.2
group together 95.3
chair 90.1
sit 90.1
leader 89.9
wedding 86.2
actor 84.5
furniture 83.5
veil 83.1
portrait 82.1
indoors 80.6
child 80
administration 79.9
wear 79.3
several 78.7
many 77.5

Imagga
created on 2022-03-05

man 44.4
male 41.1
person 40.5
room 39.4
businessman 38.9
people 35.7
business 34.6
office 33.1
meeting 33
classroom 32.7
professional 30.9
group 29.8
teacher 29
table 27.7
adult 27.6
team 26
senior 25.3
men 24.9
businesspeople 23.7
executive 23.5
work 22
happy 21.9
businesswoman 21.8
corporate 20.6
sitting 20.6
couple 20
teamwork 19.5
job 19.5
talking 19
smiling 18.8
worker 18.4
desk 18
mature 17.7
women 17.4
laptop 17.3
working 16.8
manager 16.8
together 16.7
conference 16.6
colleagues 16.5
educator 15.1
planner 15
education 14.7
discussion 14.6
home 14.4
communication 14.3
successful 13.7
computer 13.6
portrait 13.6
elderly 13.4
modern 13.3
company 13
student 13
entrepreneur 12.4
lifestyle 12.3
smile 12.1
indoor 11.9
coworkers 11.8
generator 11.8
indoors 11.4
cheerful 11.4
associates 10.8
40s 10.7
conversation 10.7
hand 10.6
success 10.5
career 10.4
looking 10.4
technology 10.4
presentation 10.2
casual 10.2
speaker 10.1
employee 10.1
horizontal 10
suit 9.9
seminar 9.8
hall 9.8
school 9.6
diversity 9.6
retirement 9.6
workplace 9.5
ethnic 9.5
confident 9.1
new 8.9
explaining 8.9
diverse 8.8
teaching 8.8
retired 8.7
30s 8.7
happiness 8.6
boss 8.6
plan 8.5
two 8.5
learning 8.5
friends 8.5
life 8.5
study 8.4
old 8.4
board 8.1
medical 7.9
blackboard 7.9
70s 7.9
discussing 7.9
day 7.8
paper 7.8
building 7.8
60s 7.8
partners 7.8
older 7.8
mid adult 7.7
class 7.7
partnership 7.7
collar 7.7
formal 7.6
finance 7.6
engineer 7.6
clothing 7.5
camera 7.4
handsome 7.1
idea 7.1
interior 7.1

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

table 97.7
furniture 95.6
chair 95.5
text 95.2
person 89.5
clothing 87.5
man 82.7

Color Analysis

Face analysis

AWS Rekognition

Age 31-41
Gender Female, 97.4%
Calm 87.3%
Happy 4.6%
Surprised 3.5%
Sad 1.2%
Angry 1.1%
Disgusted 0.9%
Confused 0.7%
Fear 0.6%

AWS Rekognition

Age 26-36
Gender Female, 73%
Sad 59.6%
Calm 34.1%
Happy 3.7%
Confused 1%
Angry 0.6%
Surprised 0.4%
Disgusted 0.4%
Fear 0.2%

AWS Rekognition

Age 36-44
Gender Male, 98.6%
Calm 62.4%
Happy 23.8%
Sad 7.2%
Confused 2.8%
Disgusted 1.7%
Angry 1%
Surprised 0.8%
Fear 0.3%

AWS Rekognition

Age 35-43
Gender Male, 97.6%
Sad 63.9%
Calm 10.2%
Confused 8%
Disgusted 7.7%
Happy 4.5%
Angry 3.3%
Surprised 1.8%
Fear 0.6%

AWS Rekognition

Age 28-38
Gender Female, 99%
Happy 64%
Calm 27.6%
Confused 3%
Sad 1.4%
Surprised 1.4%
Disgusted 1%
Fear 0.9%
Angry 0.6%

AWS Rekognition

Age 37-45
Gender Male, 96.9%
Happy 68.6%
Calm 18.4%
Surprised 6%
Sad 3%
Confused 1.9%
Disgusted 1.2%
Angry 0.5%
Fear 0.4%

AWS Rekognition

Age 38-46
Gender Female, 91.7%
Sad 52.6%
Calm 19.3%
Happy 13.8%
Confused 9.4%
Disgusted 1.9%
Angry 1.6%
Surprised 1%
Fear 0.5%

AWS Rekognition

Age 35-43
Gender Female, 70.3%
Happy 83.6%
Confused 6.3%
Calm 4.8%
Sad 2.8%
Surprised 1.3%
Disgusted 0.4%
Fear 0.4%
Angry 0.4%

AWS Rekognition

Age 36-44
Gender Male, 99.8%
Calm 58.2%
Confused 13.9%
Sad 12.2%
Fear 9%
Disgusted 3.5%
Angry 1.8%
Happy 1%
Surprised 0.5%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Chair 99.9%
Chair 99.5%
Chair 93.9%
Person 99.2%
Person 98.1%
Person 97.6%
Person 97.6%
Person 97.3%
Person 93.7%
Tie 68.8%
Suit 58.9%

Text analysis

Amazon

7E
MJIR
KAOOM
MJIR VT37482
YY33492
KAGO
MILE YY33492
MILE
VT37482

Google

MJI7 VT3 7E
MJI7
VT3
7E