Human Generated Data

Title

Untitled (men having meeting around desk)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17524

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Person 99.6
Human 99.6
Person 99.5
Person 99.2
Person 99.1
Person 99.1
Person 99.1
Person 98.9
Person 98.6
Person 98.3
Indoors 94.3
Room 94
Clinic 91.2
Tie 79.5
Accessories 79.5
Accessory 79.5
Crowd 66.3
People 65
Interior Design 63.9
Court 60.2
Hospital 59.9
Meeting Room 56.8
Conference Room 56.8
Operating Theatre 55.7
Tie 51.2

Clarifai
created on 2023-10-29

people 99.4
indoors 98.3
woman 98
man 98
group 97.4
adult 97
league 92.7
education 91.9
meeting 90.4
group together 89.3
monochrome 88.7
child 85.8
room 84.7
leader 81.7
sit 80
administration 79.8
chair 79.5
furniture 79.4
table 77
seminar 72

Imagga
created on 2022-02-26

patient 42.4
hospital 41.4
room 35.2
person 34.1
man 28.9
interior 28.3
indoors 27.2
home 25.5
people 25.1
nurse 24
table 23.8
male 22
doctor 21.6
modern 20.3
medical 20.3
adult 20.3
clinic 19.8
professional 19.4
case 19.1
house 18.4
women 18.2
kitchen 18.1
sick person 17.4
health 17.4
lab coat 17.2
work 16.5
specialist 15.3
medicine 15
office 13.9
occupation 13.8
smiling 13.7
luxury 13.7
worker 13.7
furniture 13.7
chair 13.4
coat 13.4
couple 13.1
men 12.9
inside 12.9
decor 12.4
uniform 12.4
lifestyle 12.3
glass 11.9
happy 11.9
indoor 11.9
two people 11.7
team 11.7
business 11.5
care 11.5
talking 11.4
sitting 11.2
treatment 11
holding 10.7
job 10.6
working 10.6
bed 10.4
clothing 10
businessman 9.7
mid adult 9.6
illness 9.5
meeting 9.4
surgeon 9.4
senior 9.4
window 9.2
practitioner 9.1
design 9
color 8.9
to 8.9
casual clothing 8.8
happiness 8.6
comfortable 8.6
architecture 8.6
togetherness 8.5
food 8.5
domestic 8.4
horizontal 8.4
teamwork 8.3
businesswoman 8.2
group 8.1
computer 8
together 7.9
smile 7.8
surgery 7.8
corporate 7.7
30s 7.7
pain 7.7
casual 7.6
two 7.6
dining 7.6
businesspeople 7.6
cabinet 7.6
communication 7.6
service 7.4
light 7.4
restaurant 7.2
breakfast 7.1
life 7

Google
created on 2022-02-26

Coat 91.1
Black-and-white 84.3
Style 83.8
Hat 83.3
Building 80.8
Table 75.8
Monochrome photography 72.7
Event 72.4
Monochrome 72.3
Service 72.1
White-collar worker 71.4
Suit 69.9
Desk 66.8
Curtain 66.2
Room 65.4
Stock photography 63.3
Job 61.5
T-shirt 60.8
Cooking 60.3
Employment 55.8

Microsoft
created on 2022-02-26

person 93.6
indoor 90.2
window 85.8
clothing 85.6
table 79.4
man 73
furniture 70.6

Face analysis

AWS Rekognition

Age 45-51
Gender Male, 85.1%
Calm 99.9%
Surprised 0.1%
Happy 0%
Fear 0%
Angry 0%
Disgusted 0%
Sad 0%
Confused 0%

AWS Rekognition

Age 27-37
Gender Female, 68.6%
Calm 92%
Sad 4.1%
Fear 2%
Disgusted 0.7%
Confused 0.4%
Happy 0.4%
Angry 0.3%
Surprised 0.1%

AWS Rekognition

Age 23-33
Gender Female, 52.6%
Calm 99.2%
Sad 0.7%
Confused 0%
Disgusted 0%
Happy 0%
Fear 0%
Angry 0%
Surprised 0%

AWS Rekognition

Age 39-47
Gender Female, 86%
Calm 95.3%
Sad 2.8%
Confused 0.5%
Surprised 0.4%
Happy 0.3%
Angry 0.2%
Disgusted 0.2%
Fear 0.2%

AWS Rekognition

Age 22-30
Gender Female, 80.1%
Calm 98.1%
Sad 1.6%
Confused 0.1%
Happy 0.1%
Angry 0%
Disgusted 0%
Surprised 0%
Fear 0%

AWS Rekognition

Age 21-29
Gender Female, 63.4%
Calm 93.4%
Sad 4.3%
Happy 0.5%
Confused 0.5%
Disgusted 0.5%
Surprised 0.4%
Angry 0.3%
Fear 0.1%

AWS Rekognition

Age 18-24
Gender Female, 78.5%
Calm 87.3%
Happy 6.5%
Sad 2.3%
Angry 1.5%
Fear 0.9%
Disgusted 0.7%
Confused 0.6%
Surprised 0.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Tie
Person 99.6%
Person 99.5%
Person 99.2%
Person 99.1%
Person 99.1%
Person 99.1%
Person 98.9%
Person 98.6%
Person 98.3%
Tie 79.5%
Tie 51.2%

Categories

Imagga

interior objects 100%

Text analysis

Amazon

9.

Google

MJIA- -YT3RA°2- -XAGON
MJIA-
-YT3RA°2-
-XAGON