Human Generated Data

Title

Untitled (architects working in office)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17525

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Person 99
Human 99
Person 98.8
Person 98.8
Person 98.7
Person 96.9
Person 96.7
Lab 91.5
Person 91.4
Person 80.8
Building 79.9
Clinic 76.6
Indoors 69.3
Train 65.3
Transportation 65.3
Vehicle 65.3
Cafeteria 62.6
Restaurant 62.6
Room 60.3
LCD Screen 59.5
Electronics 59.5
Screen 59.5
Monitor 59.5
Display 59.5
Workshop 58.3
Factory 57
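
The label/confidence pairs above match the shape returned by Amazon Rekognition's DetectLabels API. A minimal sketch of how comparable tags could be generated with boto3, assuming AWS credentials are configured; the filename is hypothetical:

import boto3

# Minimal sketch: Rekognition label detection on a local copy of the image.
client = boto3.client("rekognition")

with open("untitled_architects_office.jpg", "rb") as f:  # hypothetical filename
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=50,  # roughly the floor of the scores listed above
    )

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")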

Clarifai
created on 2023-10-29

people 99.8
monochrome 99.4
adult 98.7
man 97.6
group 96.5
chair 95.3
woman 95.1
vehicle 93.6
transportation system 93
furniture 92.7
group together 92.6
indoors 91.6
several 88.9
three 88.9
room 88.8
street 86.8
desk 86.7
two 85.8
employee 84.6
administration 84.3

Imagga
created on 2022-02-26

barbershop 100
shop 89.9
mercantile establishment 66
place of business 44.2
interior 31.8
hospital 28.6
room 25.9
indoors 24.6
chair 23.9
inside 23
people 22.3
man 22.2
establishment 22.1
medical 20.3
work 19.6
patient 19.4
health 18.8
furniture 17.9
equipment 17.6
modern 17.5
male 17
person 16.6
table 16.6
seat 16.4
barber chair 16.3
professional 16.1
doctor 16
medicine 15.9
adult 14.9
office 14.8
clinic 14.6
working 14.1
indoor 13.7
counter 13.1
men 12.9
nurse 12.8
illness 12.4
business 12.1
home 12
restaurant 11.7
senior 11.2
computer 11.2
technology 11.1
kitchen 11
occupation 11
horizontal 10.9
lifestyle 10.8
care 10.7
bed 10.4
portrait 10.4
women 10.3
job 9.7
monitor 9.7
sick 9.7
emergency 9.6
specialist 9.5
uniform 9.5
smiling 9.4
happy 9.4
light 9.4
floor 9.3
house 9.2
treatment 9.2
food 9.1
worker 8.9
decor 8.8
steel 8.8
exam 8.6
dining 8.6
glass 8.6
industry 8.5
holding 8.3
transportation 8.1
sterile 7.9
nobody 7.8
sitting 7.7
comfortable 7.6
lamp 7.6
dinner 7.6
machine 7.6
device 7.5
wood 7.5
service 7.4
design 7.3
looking 7.2
businessman 7.1
life 7
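
Imagga serves its auto-tagger over a REST endpoint. A rough sketch of a request that returns tags in the form listed above, assuming an Imagga key/secret pair and a publicly reachable image URL (all placeholders):

import requests

# Placeholders: substitute a real Imagga key/secret and image URL.
API_KEY = "YOUR_IMAGGA_KEY"
API_SECRET = "YOUR_IMAGGA_SECRET"
IMAGE_URL = "https://example.org/untitled_architects_office.jpg"

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),  # HTTP basic auth, per Imagga's API docs
)

for item in response.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")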

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

person 98.6
indoor 97.9
text 92.9
black and white 58.8
clothing 58.2
table 57.1
preparing 47.7
cooking 23.3
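
The Microsoft tags resemble the output of Azure Computer Vision's image-tagging operation. A rough sketch using the azure-cognitiveservices-vision-computervision SDK, assuming a valid endpoint and key (placeholders below); the SDK reports confidence as a 0-1 float, printed here as a percentage:

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Placeholders: substitute a real Azure endpoint and subscription key.
ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com/"
KEY = "YOUR_AZURE_KEY"

client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))

with open("untitled_architects_office.jpg", "rb") as f:  # hypothetical filename
    result = client.tag_image_in_stream(f)

for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")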

Color analysis

Face analysis

AWS Rekognition

Age 13-21
Gender Female, 97.4%
Sad 67.1%
Calm 18.6%
Fear 4%
Confused 3.6%
Happy 2.8%
Angry 2.6%
Disgusted 0.8%
Surprised 0.4%

AWS Rekognition

Age 26-36
Gender Female, 66.1%
Sad 46.9%
Happy 33.9%
Calm 11.1%
Surprised 3.3%
Angry 1.4%
Disgusted 1.3%
Confused 1.2%
Fear 0.9%
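
The age range, gender, and ranked emotion scores above follow the FaceDetails structure returned by Amazon Rekognition's DetectFaces API. A minimal sketch, again with a hypothetical filename and credentials assumed configured:

import boto3

client = boto3.client("rekognition")

with open("untitled_architects_office.jpg", "rb") as f:  # hypothetical filename
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age, gender, and emotion estimates
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")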

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
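
The likelihood ratings above correspond to Google Cloud Vision face detection, which reports each attribute as an enum from VERY_UNLIKELY to VERY_LIKELY rather than a numeric score. A minimal sketch with the google-cloud-vision client, assuming application credentials are set up (filename hypothetical):

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("untitled_architects_office.jpg", "rb") as f:  # hypothetical filename
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)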

Feature analysis

Amazon

Person
Train
Person 99%
Person 98.8%
Person 98.8%
Person 98.7%
Person 96.9%
Person 96.7%
Person 91.4%
Person 80.8%
Train 65.3%

Categories

Text analysis

Amazon

MINNESOTA
is
UPA
UPA سعد
سعد
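
These lines are raw OCR detections from the photograph. The Amazon results match the output of Rekognition's DetectText API, which could be reproduced roughly as follows (hypothetical filename, credentials assumed configured):

import boto3

client = boto3.client("rekognition")

with open("untitled_architects_office.jpg", "rb") as f:  # hypothetical filename
    response = client.detect_text(Image={"Bytes": f.read()})

# Rekognition returns LINE detections followed by their component WORDs.
for det in response["TextDetections"]:
    print(det["Type"], det["DetectedText"], f"{det['Confidence']:.1f}")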

Google

MJIA- -YTERA°2- -XAGO
MJIA-
-YTERA°2-
-XAGO