Human Generated Data

Title

Untitled (men having meeting around desk)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17523

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Person 99.5
Human 99.5
Person 99.4
Clinic 99.2
Person 99.2
Person 98.3
Person 98.2
Hospital 98
Operating Theatre 94.3
Person 93.4
Person 91.2
Shoe 91
Footwear 91
Clothing 91
Apparel 91
Person 89.1
Person 87.1
Furniture 77.9
Chair 71.1
Person 68.9
Room 56.8
Indoors 56.8
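
The Amazon tags above are the kind of label list returned by Amazon Rekognition's label-detection API. A minimal sketch, assuming boto3 with configured AWS credentials and a hypothetical local file photo.jpg:

```python
# Minimal sketch: label detection with Amazon Rekognition via boto3.
# Assumes AWS credentials are configured and "photo.jpg" (hypothetical
# file name) holds the image; the tags above correspond to Labels.
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=50,  # only return labels scored at 50% or higher
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```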

Clarifai
created on 2023-10-29

people 99.7
group 98.7
woman 98.3
man 98.2
adult 97.5
indoors 97.3
room 94
group together 91.4
furniture 88.6
chair 86
league 85.2
leader 84.4
wedding 84
monochrome 83.7
education 83.6
meeting 80.7
child 79.4
medical practitioner 79.2
hotel 78.3
ceremony 76.9
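
The Clarifai concepts above could be reproduced with Clarifai's general image-recognition model. A hedged sketch against the v2 REST API; the endpoint path, model id, and API-key header are assumptions taken from Clarifai's public documentation, not from this record:

```python
# Hedged sketch of querying Clarifai's general image-recognition model
# over its v2 REST API. The endpoint path, model id, and "Key" auth
# header are assumptions based on Clarifai's documented v2 interface;
# check the current docs before relying on them.
import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"       # placeholder
MODEL_ID = "general-image-recognition"  # assumed public model id
URL = f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs"

payload = {
    "inputs": [
        {"data": {"image": {"url": "https://example.com/photo.jpg"}}}
    ]
}

resp = requests.post(
    URL,
    json=payload,
    headers={"Authorization": f"Key {API_KEY}"},
    timeout=30,
)
resp.raise_for_status()

for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")
```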

Imagga
created on 2022-02-26

room 47.8
interior 46.9
table 39.9
hospital 32.9
chair 32.4
modern 32.3
furniture 27.3
indoors 24.6
counter 24.5
office 24.1
restaurant 22.9
house 21.7
home 21.6
women 21.4
decor 20.4
design 20.3
indoor 20.1
work 18.9
floor 18.6
person 18.4
hall 17.6
luxury 17.2
people 16.8
wood 16.7
professional 16.4
salon 16.3
light 16.1
lifestyle 15.9
inside 15.7
architecture 15.6
man 15.5
lamp 15.3
dining 15.2
business 15.2
male 14.9
window 14.8
comfortable 14.3
style 14.1
cafeteria 14
empty 13.8
apartment 13.4
desk 13.3
meeting 13.2
teacher 13.1
building 12.9
shop 12.8
businessman 12.4
worker 12.3
contemporary 12.2
corporate 12
sitting 12
men 12
chairs 11.8
conference 11.7
glass 11.7
two people 11.7
team 11.7
3d 11.6
kitchen 11.6
patient 11.4
nurse 11.4
adult 11.4
clinic 11.2
teamwork 11.1
service 11.1
smiling 10.9
group 10.5
talking 10.5
executive 10.4
food 10.3
decoration 10.1
communication 10.1
happy 10
businesswoman 10
barbershop 9.8
success 9.7
computer 9.6
seat 9.6
sofa 9.6
hotel 9.6
occupation 9.2
laptop 9.1
color 8.9
classroom 8.9
tables 8.9
20-24 years 8.9
working 8.8
medical 8.8
elegant 8.6
nobody 8.6
living 8.5
relaxation 8.4
bar 8.3
treatment 8.3
stylish 8.1
suit 8.1
new 8.1
job 8
practitioner 7.9
leisure activity 7.9
casual clothing 7.8
wall 7.8
render 7.8
carpet 7.8
full length 7.8
structure 7.6
dinner 7.6
elegance 7.6
togetherness 7.6
enjoyment 7.5
manager 7.5
life 7.4
coat 7.3
together 7
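
Imagga exposes its tagger over a REST endpoint. A hedged sketch using HTTP basic auth with an API key/secret pair; the endpoint and response shape are assumptions based on Imagga's public v2 documentation:

```python
# Hedged sketch of requesting tags from the Imagga REST API
# (https://api.imagga.com/v2/tags). The endpoint, basic-auth key/secret
# pair, and response shape are assumptions from Imagga's v2 docs.
import requests

API_KEY = "YOUR_IMAGGA_API_KEY"        # placeholder
API_SECRET = "YOUR_IMAGGA_API_SECRET"  # placeholder

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/photo.jpg"},
    auth=(API_KEY, API_SECRET),
    timeout=30,
)
resp.raise_for_status()

for tag in resp.json()["result"]["tags"]:
    print(f"{tag['tag']['en']} {tag['confidence']:.1f}")
```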

Google
created on 2022-02-26

Interior design 85.2
Black-and-white 84.1
Style 83.9
Table 83.4
Art 81.5
Suit 78.9
Monochrome 75.4
Monochrome photography 74.2
Chair 68.9
Event 67.4
Room 66.6
Rectangle 65.9
Visual arts 65.7
White-collar worker 63.5
Stock photography 62.3
Font 58.2
Desk 55.7
Ceiling 55.2
Machine 52.6
History 50.8
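
The Google tags above match the output of the Cloud Vision API's label detection. A minimal sketch with the google-cloud-vision client, assuming application default credentials and a hypothetical photo.jpg:

```python
# Minimal sketch: label detection with the Google Cloud Vision API,
# using the google-cloud-vision client library. Assumes application
# default credentials are configured; the image path is hypothetical.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.label_detection(image=image)

for label in response.label_annotations:
    print(f"{label.description} {label.score * 100:.1f}")
```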

Microsoft
created on 2022-02-26

wall 96.6
indoor 96.4
person 95.2
clothing 92
floor 90.8
ceiling 88.7
table 88.3
man 84.8
woman 71.4
wedding dress 66
several 14
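
The Microsoft tags above resemble output from Azure's Computer Vision image-tagging operation. A hedged sketch with the azure-cognitiveservices-vision-computervision SDK; the endpoint, key, and method name are assumptions drawn from the SDK documentation:

```python
# Hedged sketch of image tagging with the Azure Computer Vision SDK
# (azure-cognitiveservices-vision-computervision). Endpoint and key are
# placeholders; tag_image_in_stream and the response shape follow the
# SDK documentation, but treat the details as assumptions.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com/"  # placeholder
KEY = "YOUR_AZURE_KEY"                                           # placeholder

client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))

with open("photo.jpg", "rb") as f:
    result = client.tag_image_in_stream(f)

for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")
```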

Face analysis

AWS Rekognition

Age 27-37
Gender Male, 55.2%
Sad 76.7%
Calm 19.2%
Surprised 1.8%
Confused 1.5%
Angry 0.3%
Disgusted 0.2%
Fear 0.1%
Happy 0.1%

AWS Rekognition

Age 29-39
Gender Female, 68.3%
Calm 99.5%
Sad 0.3%
Confused 0.1%
Happy 0%
Surprised 0%
Angry 0%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 29-39
Gender Male, 60.5%
Calm 53%
Happy 24.9%
Sad 10.8%
Confused 9.3%
Angry 0.7%
Disgusted 0.6%
Surprised 0.5%
Fear 0.3%

AWS Rekognition

Age 29-39
Gender Female, 70%
Calm 97.1%
Fear 1.2%
Surprised 0.6%
Sad 0.5%
Confused 0.3%
Disgusted 0.2%
Happy 0.1%
Angry 0.1%
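
The age, gender, and emotion breakdowns above are the fields Amazon Rekognition returns from face detection when all attributes are requested. A minimal boto3 sketch, assuming a hypothetical photo.jpg:

```python
# Minimal sketch: face analysis with Amazon Rekognition. With
# Attributes=["ALL"], detect_faces returns an estimated age range,
# gender, and per-face emotion confidences, matching the fields
# reported above. The file name is hypothetical.
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```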

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
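
The Google Vision blocks above report likelihood buckets (Very unlikely through Very likely) for each detected face. A minimal sketch with the google-cloud-vision client (the 2.x proto-plus interface is assumed), using a hypothetical photo.jpg:

```python
# Minimal sketch: face detection with the Google Cloud Vision API.
# Each face annotation carries Likelihood enums (VERY_UNLIKELY ...
# VERY_LIKELY) for joy, sorrow, anger, surprise, headwear, and blur,
# which is the scale reported above. Assumes google-cloud-vision 2.x.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)
```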

Feature analysis

Amazon

Person 99.5%
Person 99.4%
Person 99.2%
Person 98.3%
Person 98.2%
Person 93.4%
Person 91.2%
Person 89.1%
Person 87.1%
Person 68.9%
Shoe 91%
Chair 71.1%
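
The per-object percentages above correspond to Rekognition label instances, each of which carries its own bounding box and confidence. A minimal sketch, again assuming a hypothetical photo.jpg:

```python
# Minimal sketch of where the per-object scores above come from:
# Rekognition's detect_labels response includes an Instances list with
# a bounding box and confidence for each localized object (Person,
# Shoe, Chair, ...). Assumes "photo.jpg" as a hypothetical input.
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_labels(Image={"Bytes": f.read()})

for label in response["Labels"]:
    for instance in label["Instances"]:
        box = instance["BoundingBox"]  # Left/Top/Width/Height as ratios
        print(f"{label['Name']} {instance['Confidence']:.1f}% at "
              f"({box['Left']:.2f}, {box['Top']:.2f})")
```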

Categories

Imagga

interior objects 100%

Text analysis

Amazon

12

Google

12 MJIA- -YT3RA°2- -XAGO
12
MJIA-
-YT3RA°2-
-XAGO
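
The detected strings above are OCR output. A minimal sketch of Amazon Rekognition's text detection, assuming a hypothetical photo.jpg; Google's reading could be reproduced analogously with the Vision API's text_detection method:

```python
# Minimal sketch: text detection (OCR) with Amazon Rekognition.
# detect_text returns both LINE and WORD detections with confidences,
# the kind of output summarized in the Text analysis section above.
# The file name is hypothetical.
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_text(Image={"Bytes": f.read()})

for detection in response["TextDetections"]:
    print(f"{detection['Type']}: {detection['DetectedText']} "
          f"({detection['Confidence']:.1f}%)")
```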