Human Generated Data

Title

Untitled (group portrait in living room)

Date

c. 1950

People

Artist: Clement McLarty, American, active 1960s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19720

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 99.4
Human 99.4
Person 99.3
Person 99.1
Chair 98.6
Furniture 98.6
Person 98.6
Person 98.1
Person 97.7
Person 93
Person 91.5
Indoors 88.5
Clothing 87.6
Apparel 87.6
Interior Design 86.8
Face 83.6
Suit 83.3
Overcoat 83.3
Coat 83.3
Crowd 80.8
Tabletop 80.7
Room 80.2
Meal 77.9
Food 77.9
Helmet 77.5
Helmet 76.3
People 74.6
Person 72.6
Person 71.9
Helmet 67.7
Table 66.6
Dish 60.4
Female 57.7
Tuxedo 57.5
Bar Counter 55.9
Pub 55.9
Dining Table 55
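
The label/confidence pairs above (and the Clarifai, Imagga, and Microsoft lists that follow) are the typical output of an image-tagging API. A minimal sketch of reproducing the Amazon list with the boto3 Rekognition client, assuming configured AWS credentials; the image file name is hypothetical:

```python
import boto3

# Rekognition client; assumes AWS credentials are already configured.
client = boto3.client("rekognition")

# Hypothetical local copy of the photograph.
with open("group_portrait.jpg", "rb") as f:
    image_bytes = f.read()

# MinConfidence=55 mirrors the lowest score in the list above;
# Rekognition reports confidence on a 0-100 scale.
response = client.detect_labels(Image={"Bytes": image_bytes}, MinConfidence=55)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```

The other services expose the same idea through their own endpoints (Clarifai and Imagga via REST calls, Microsoft via the Azure Computer Vision analyze operation), differing mainly in tag vocabulary and score scale.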

Clarifai
created on 2023-10-22

people 99.9
group 99.1
adult 97.1
man 96.4
group together 95.9
woman 95.8
many 95.2
leader 90.6
furniture 90.5
several 89.4
indoors 89.3
education 86.1
administration 85.7
medical practitioner 84.1
monochrome 83.7
five 81.4
child 79.2
wear 77.6
sit 77
room 75.1

Imagga
created on 2022-03-05

musical instrument 44
percussion instrument 38.6
man 37.6
male 33.3
people 30.6
person 28.9
marimba 27.5
adult 22.8
business 22.5
professional 22.2
businessman 21.2
room 21
men 19.7
teacher 19.4
classroom 19.3
wind instrument 18.9
mature 18.6
group 18.5
brass 18.3
team 17
smiling 16.6
colleagues 16.5
medical 15.9
women 15.8
office 15.2
meeting 15.1
happy 15
nurse 15
indoors 14.9
lifestyle 14.4
job 14.1
patient 14
doctor 13.1
couple 13.1
table 13
teamwork 13
hospital 12.5
talking 12.3
together 12.3
senior 12.2
education 12.1
work 12
businesswoman 11.8
waiter 11.5
interior 11.5
worker 11.4
smile 11.4
cheerful 11.4
modern 11.2
sitting 11.2
home 11.2
two 11
businesspeople 10.4
desk 10.4
casual 10.2
surgeon 10.1
occupation 10.1
sax 9.9
care 9.9
handsome 9.8
staff 9.6
standing 9.6
communication 9.2
life 9.1
health 9
stage 9
suit 9
kitchen 8.9
student 8.9
to 8.8
working 8.8
discussion 8.8
blackboard 8.7
class 8.7
30s 8.6
black 8.4
portrait 8.4
school 8.3
holding 8.2
clinic 8.2
board 8.1
coat 8.1
new 8.1
educator 8
looking 8
medicine 7.9
discussing 7.8
boy 7.8
drum 7.8
middle aged 7.8
corporate 7.7
four 7.7
old 7.7
profession 7.6
dining-room attendant 7.6
device 7.6
religious 7.5
employee 7.5
20s 7.3
friendly 7.3
practitioner 7.2

Microsoft
created on 2022-03-05

person 99.1
clothing 93.4
man 83.5
text 80.7
standing 79.9
posing 69
group 66
black and white 65.6
table 29.2
restaurant 21.6
dining table 10.9

Face analysis

AWS Rekognition

Age 35-43
Gender Male, 100%
Calm 74.2%
Sad 18.8%
Surprised 2.9%
Confused 1.8%
Angry 0.7%
Happy 0.6%
Disgusted 0.6%
Fear 0.4%

AWS Rekognition

Age 52-60
Gender Male, 97.9%
Sad 96.7%
Happy 1.5%
Confused 0.8%
Calm 0.4%
Angry 0.3%
Disgusted 0.1%
Fear 0.1%
Surprised 0.1%

AWS Rekognition

Age 36-44
Gender Male, 99.9%
Disgusted 23.7%
Confused 22.2%
Happy 18.4%
Calm 16.5%
Surprised 8.2%
Sad 7%
Angry 2.5%
Fear 1.5%

AWS Rekognition

Age 45-53
Gender Female, 98.5%
Calm 49.6%
Happy 42.9%
Confused 4%
Surprised 1.1%
Disgusted 1%
Sad 0.7%
Angry 0.5%
Fear 0.3%

AWS Rekognition

Age 37-45
Gender Male, 99.9%
Calm 74.9%
Sad 19.8%
Confused 1.7%
Angry 1.2%
Disgusted 1.2%
Surprised 0.6%
Happy 0.4%
Fear 0.3%

AWS Rekognition

Age 20-28
Gender Male, 99.9%
Calm 81.5%
Happy 7.8%
Sad 3.8%
Surprised 3%
Confused 1.6%
Angry 1.3%
Disgusted 0.6%
Fear 0.4%

AWS Rekognition

Age 36-44
Gender Female, 50.4%
Happy 87.3%
Calm 9.6%
Surprised 1.9%
Angry 0.4%
Disgusted 0.3%
Sad 0.3%
Confused 0.2%
Fear 0.1%
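
The seven age/gender/emotion blocks above match the shape of Amazon Rekognition's DetectFaces output when all facial attributes are requested. A minimal sketch, again assuming configured boto3 credentials and a hypothetical file name:

```python
import boto3

client = boto3.client("rekognition")

with open("group_portrait.jpg", "rb") as f:  # hypothetical file name
    image_bytes = f.read()

# Attributes=["ALL"] adds age range, gender, and emotions to the
# default bounding-box output.
response = client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age, gender = face["AgeRange"], face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions arrive unsorted; sort descending to match the listing above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```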

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely
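
The nine blocks above follow Google Cloud Vision's face-detection schema, which reports each attribute as a likelihood bucket (VERY_UNLIKELY through VERY_LIKELY) rather than a numeric confidence. A minimal sketch with the google-cloud-vision client; the file name is hypothetical:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("group_portrait.jpg", "rb") as f:  # hypothetical file name
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Likelihood enum values are small ints, so a tuple lookup converts
# them to the labels used in the listing above.
likelihood = ("Unknown", "Very unlikely", "Unlikely",
              "Possible", "Likely", "Very likely")

for face in response.face_annotations:
    print("Surprise", likelihood[face.surprise_likelihood])
    print("Anger", likelihood[face.anger_likelihood])
    print("Sorrow", likelihood[face.sorrow_likelihood])
    print("Joy", likelihood[face.joy_likelihood])
    print("Headwear", likelihood[face.headwear_likelihood])
    print("Blurred", likelihood[face.blurred_likelihood])
```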

Feature analysis

Amazon

Person 99.4%
Person 99.3%
Person 99.1%
Person 98.6%
Person 98.1%
Person 97.7%
Person 93%
Person 91.5%
Person 72.6%
Person 71.9%
Chair 98.6%
Helmet 77.5%
Helmet 76.3%
Helmet 67.7%

Text analysis

Amazon

RACION
MJI7--YT37A°2

Google

MJI7--YT3A°2
MJI7--YT3A°2
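
These strings look like OCR hits on a film edge code or darkroom stamp rather than scene text, and OCR services often return the same string at more than one granularity (line and word), which would explain the repeated Google result. A minimal sketch of text detection with Amazon Rekognition's DetectText, assuming configured boto3 credentials and a hypothetical file name:

```python
import boto3

client = boto3.client("rekognition")

with open("group_portrait.jpg", "rb") as f:  # hypothetical file name
    image_bytes = f.read()

response = client.detect_text(Image={"Bytes": image_bytes})

# DetectText returns both LINE and WORD entries, so a single string
# can appear twice in raw dumps like the lists above.
for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"],
          f"{detection['Confidence']:.1f}")
```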