Human Generated Data

Title

Untitled (men standing around automobile showroom)

Date

1949

People

Artist: Orrion Barger, American, active 1913 - 1984

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6233

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Human 99.6
Person 99.6
Person 99.1
Person 99
Person 98.9
Person 98.4
Person 97.8
Person 97.6
Person 97.6
Hat 96.5
Clothing 96.5
Apparel 96.5
Person 93.5
Person 92.6
Person 89.4
Person 89
Building 84.9
Person 84.4
Person 81
Car 74.6
Transportation 74.6
Vehicle 74.6
Automobile 74.6
Person 74.5
Person 73.9
Clinic 71.7
Person 71.2
Factory 70.9
Person 62.7
Car Wheel 60.8
Machine 60.8
Wheel 60.8
Tire 60.8
People 55.6
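
The Amazon tags above resemble the output of Amazon Rekognition's DetectLabels operation. A minimal sketch using boto3, assuming configured AWS credentials; the file name and confidence threshold are placeholders, not part of the original record:

import boto3

rekognition = boto3.client("rekognition")

# Hypothetical local copy of the digitized photograph.
with open("barger_showroom_1949.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=50,  # drop labels below 50% confidence
)

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
    # Repeated rows such as "Person" likely correspond to per-instance detections.
    for instance in label.get("Instances", []):
        print(f'{label["Name"]} {instance["Confidence"]:.1f}')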

Clarifai
created on 2023-10-26

people 99.8
woman 98
vehicle 97.7
monochrome 97.6
adult 97.6
man 96.3
group 95.3
car 95.1
transportation system 94
many 93.9
sit 89.2
street 87.8
group together 87.2
indoors 86.1
child 80.8
several 78.9
leader 78.8
administration 77.8
room 77.4
education 76.8
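
The Clarifai tags could be reproduced with Clarifai's general image-recognition model. The sketch below is an assumption based on Clarifai's v2 REST documentation; the endpoint path, model identifier, credential, image URL, and response shape are all placeholders and may differ from the current API:

import requests

PAT = "your_clarifai_api_key"  # placeholder access token
IMAGE_URL = "https://example.org/barger_showroom_1949.jpg"  # hypothetical URL
MODEL_ID = "general-image-recognition"  # assumed id of the general model

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {PAT}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
)
response.raise_for_status()

# Concepts carry a 0-1 value; scale to percent to match the list above.
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')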

Imagga
created on 2022-01-22

person 36.9
people 31.2
man 28.2
male 24.8
nurse 19.5
education 19
adult 18.7
business 17.6
student 17
patient 16.5
medical 15.9
room 15.5
blackboard 15.2
teacher 15
businessman 15
portrait 14.9
men 14.6
work 14.1
professional 13.5
old 13.2
technology 12.6
smiling 12.3
office 12.3
senior 12.2
home 12
classroom 11.8
class 11.6
indoors 11.4
hospital 11.4
shop 11.2
barbershop 11.1
health 11.1
school 11
science 10.7
happy 10.6
human 10.5
world 10.4
black 10.2
worker 10
team 9.8
working 9.7
medicine 9.7
group 9.7
computer 9.6
elderly 9.6
women 9.5
doctor 9.4
planner 9.3
hand 9.1
board 9
mercantile establishment 8.8
scientist 8.8
teaching 8.8
lab 8.7
chemistry 8.7
laboratory 8.7
serious 8.6
smile 8.5
college 8.5
casual 8.5
coat 8.4
study 8.4
horizontal 8.4
case 8.3
occupation 8.2
looking 8
job 8
day 7.8
hands 7.8
teach 7.8
scientific 7.7
studying 7.7
exam 7.7
illness 7.6
finance 7.6
biology 7.6
desk 7.5
teamwork 7.4
care 7.4
camera 7.4
back 7.3
bright 7.1
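
The Imagga tags are consistent with Imagga's /v2/tags REST endpoint. A hedged sketch using the requests library; the credentials and image URL are placeholders, and the response layout is assumed from Imagga's public documentation:

import requests

API_KEY = "your_imagga_api_key"        # placeholder credentials
API_SECRET = "your_imagga_api_secret"
IMAGE_URL = "https://example.org/barger_showroom_1949.jpg"  # hypothetical URL

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
)
response.raise_for_status()

# Each entry carries an English tag name and a 0-100 confidence score.
for item in response.json()["result"]["tags"]:
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')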

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

person 98.1
car 93
vehicle 92.8
land vehicle 91.1
text 87.4
clothing 82.8
man 75.5
wheel 55
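
The Microsoft tags match the kind of output returned by the Azure Computer Vision image-analysis REST API. A sketch assuming a v3.2 analyze endpoint; the resource endpoint, key, and image URL are placeholders:

import requests

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
KEY = "<your-computer-vision-key>"                                # placeholder
IMAGE_URL = "https://example.org/barger_showroom_1949.jpg"        # hypothetical URL

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": IMAGE_URL},
)
response.raise_for_status()

# Tags arrive with a 0-1 confidence; scale to percent to match the list above.
for tag in response.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')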

Color analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 41-49
Gender Male, 98.9%
Calm 92.5%
Sad 5.5%
Confused 0.9%
Angry 0.5%
Disgusted 0.2%
Surprised 0.2%
Happy 0.2%
Fear 0.1%

AWS Rekognition

Age 24-34
Gender Male, 97.3%
Calm 59.3%
Sad 18.9%
Confused 16.1%
Happy 2.1%
Surprised 1.3%
Disgusted 1.2%
Angry 0.7%
Fear 0.5%

AWS Rekognition

Age 33-41
Gender Male, 99.8%
Calm 32%
Fear 19.7%
Sad 18.4%
Confused 13.3%
Happy 7.7%
Angry 4.5%
Disgusted 2.4%
Surprised 2%

AWS Rekognition

Age 26-36
Gender Male, 93.7%
Fear 49.4%
Calm 40.5%
Disgusted 2.8%
Sad 2.1%
Happy 1.7%
Surprised 1.6%
Confused 1.2%
Angry 0.7%

AWS Rekognition

Age 52-60
Gender Female, 74.2%
Calm 90.3%
Happy 4.3%
Confused 1.9%
Sad 1.7%
Angry 0.6%
Disgusted 0.5%
Surprised 0.4%
Fear 0.3%

AWS Rekognition

Age 30-40
Gender Male, 99.5%
Fear 68%
Calm 21.5%
Sad 3.7%
Happy 2.3%
Surprised 1.7%
Angry 1.1%
Disgusted 1.1%
Confused 0.5%

AWS Rekognition

Age 25-35
Gender Male, 97.2%
Calm 95.5%
Sad 3.1%
Confused 0.6%
Angry 0.3%
Happy 0.3%
Disgusted 0.1%
Fear 0.1%
Surprised 0.1%
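
The per-face age ranges, gender estimates, and emotion percentages above correspond to Rekognition's DetectFaces operation with all attributes requested. A minimal sketch, again with a hypothetical file name:

import boto3

rekognition = boto3.client("rekognition")

with open("barger_showroom_1949.jpg", "rb") as f:  # hypothetical local file
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, and emotions
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')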

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely
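
The Google Vision rows report likelihood levels rather than percentages; they correspond to face_detection results from the Cloud Vision API. A sketch assuming the google-cloud-vision client library and a hypothetical local file:

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("barger_showroom_1949.jpg", "rb") as f:  # hypothetical local file
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Likelihood enums print as names such as VERY_UNLIKELY or UNLIKELY.
for face in response.face_annotations:
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)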

Feature analysis

Amazon

Person 99.6%
Hat 96.5%
Car 74.6%

Categories

Text analysis

Amazon

This
time
Hudson
This time its
its
-
TOMORROW
DIE
en
- NY DIE en Hudson Manobili -
Manobili
33APE
NY
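
The Amazon text fragments above ("This time its", "Hudson", "TOMORROW", and so on) are typical of Rekognition's DetectText operation, which returns both line-level and word-level detections. A minimal sketch:

import boto3

rekognition = boto3.client("rekognition")

with open("barger_showroom_1949.jpg", "rb") as f:  # hypothetical local file
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

# Each detection is either a LINE or a WORD; both kinds appear in the list above.
for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"])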

Google

s
KODVK
This time s Hudson KODVK VEE
This
time
Hudson
VEE
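
The Google fragments come from optical character recognition; with the Cloud Vision client library the equivalent call is text_detection, where the first annotation holds the full detected string and later entries are individual tokens. A sketch with a hypothetical local file:

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("barger_showroom_1949.jpg", "rb") as f:  # hypothetical local file
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

# text_annotations[0] is the whole detected block; later entries are single tokens.
for annotation in response.text_annotations:
    print(annotation.description)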