Human Generated Data

Title

Untitled (two men looking down from front of train car)

Date

1953

People

Artist: Orrion Barger, American, active 1913–1984

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6322

Machine Generated Data

Tags

All values are the provider's confidence scores (%).

Amazon
created on 2022-01-22

Human 98.7
Person 94.9
Helmet 93.5
Clothing 93.5
Apparel 93.5
Person 92.3
Face 90.8
Train 80.6
Transportation 80.6
Vehicle 80.6
Doctor 74
Building 74
Urban 74
Town 74
Metropolis 74
City 74
Portrait 66.1
Photography 66.1
Photo 66.1
Person 61.1
Interior Design 60
Indoors 60
Sailor Suit 59.8
Head 59.6
Clinic 58.4
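
The tag lists in this section are each provider's raw image-labeling output. A minimal sketch of how the Amazon tags above could be regenerated with AWS Rekognition's DetectLabels call via boto3, assuming a local scan of the photograph (the file name, region, and thresholds are illustrative assumptions, not part of the record):

```python
# Sketch only: reproduce Rekognition-style label tags for this photograph.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("barger_train_car.jpg", "rb") as f:  # hypothetical local scan
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=25,      # the list above contains 25 tags
    MinConfidence=55,  # lowest confidence shown above is 58.4
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```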

Clarifai
created on 2023-10-26

people 99.7
vehicle 99.1
transportation system 98.6
adult 97.5
man 97.3
woman 96.7
two 95.1
monochrome 92.7
three 91.1
aircraft 90.4
group together 90.1
group 89.4
uniform 87.8
one 86.1
leader 85
wear 84
driver 83.5
car 82.4
veil 81.9
sitting 80.8

Imagga
created on 2022-01-22

car 28.8
computer 27.4
technology 24.5
screen 24.3
equipment 22.7
cockpit 22
work 19.6
bus 19.3
display 18.9
transportation 18.8
digital 18.6
business 18.2
vehicle 17.9
monitor 17.3
aviator 17.3
electronic equipment 16.6
device 16.5
person 16.4
man 16.1
transport 15.5
laptop 14.2
working 14.1
office 13.1
machine 12.9
people 12.3
minibus 11.8
road 11.7
auto 11.5
hand 11.4
camera 11.4
male 11.3
modern 11.2
happy 10.6
travel 10.6
automobile 10.5
adult 10.3
stereo 10.3
engineer 10.2
professional 10.1
speed 10.1
3d 10.1
job 9.7
electronic 9.3
communication 9.2
inside 9.2
telephone 8.8
sitting 8.6
effects 8.5
keyboard 8.4
fast 8.4
public transport 8.4
horizontal 8.4
phone 8.3
new 8.1
home 8
interior 8
smiling 8
driver 7.8
drive 7.7
navigation 7.7
motor vehicle 7.6
wheel 7.5
three dimensional 7.5
worker 7.1
businessman 7.1
medical 7.1

Microsoft
created on 2022-01-22

text 99.8
black and white 83.3
human face 69.7
person 69.2
vehicle 64.3
train 54.4
clothing 52.8

Face analysis

AWS Rekognition (Face 1)

Age 45-53
Gender Male, 100%
Happy 77.1%
Surprised 7.9%
Confused 5.8%
Sad 5.1%
Calm 2.2%
Disgusted 0.8%
Fear 0.7%
Angry 0.4%

AWS Rekognition (Face 2)

Age 31-41
Gender Male, 93.4%
Happy 88%
Sad 4.4%
Calm 2.9%
Surprised 1.4%
Angry 1.2%
Confused 0.8%
Disgusted 0.7%
Fear 0.5%

AWS Rekognition (Face 3)

Age 30-40
Gender Female, 79.7%
Calm 96.2%
Happy 1.2%
Fear 1%
Angry 0.6%
Sad 0.5%
Confused 0.2%
Surprised 0.2%
Disgusted 0.2%
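
Each AWS Rekognition block above is one detected face: an estimated age range, a gender call with confidence, and a distribution over eight emotions. A minimal sketch of the underlying DetectFaces request (boto3; the file name is a hypothetical stand-in for the digitized print):

```python
# Sketch only: per-face age, gender, and emotion estimates via Rekognition.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("barger_train_car.jpg", "rb") as f:  # hypothetical local scan
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # "ALL" is needed to get age range and emotions
)

for i, face in enumerate(response["FaceDetails"], start=1):
    age, gender = face["AgeRange"], face["Gender"]
    print(f"Face {i}: Age {age['Low']}-{age['High']}, "
          f"Gender {gender['Value']} {gender['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"  {emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```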

Google Vision (Face 1)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Possible
Blurred Very unlikely

Google Vision (Face 2)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely
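
The Google Vision blocks report Likelihood buckets (VERY_UNLIKELY through VERY_LIKELY, rendered above as "Very unlikely" through "Possible") rather than numeric scores. A minimal sketch using the google-cloud-vision client, under the same hypothetical file-name assumption:

```python
# Sketch only: per-face likelihood buckets via the Cloud Vision API.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("barger_train_car.jpg", "rb") as f:  # hypothetical local scan
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for i, face in enumerate(response.face_annotations, start=1):
    print(f"Face {i}:")
    print("  Surprise", face.surprise_likelihood.name)
    print("  Anger   ", face.anger_likelihood.name)
    print("  Sorrow  ", face.sorrow_likelihood.name)
    print("  Joy     ", face.joy_likelihood.name)
    print("  Headwear", face.headwear_likelihood.name)
    print("  Blurred ", face.blurred_likelihood.name)
```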

Feature analysis

Amazon

Person 94.9%
Helmet 93.5%
Train 80.6%

Categories

Imagga

paintings art 62.8%
interior objects 23.2%
text visuals 12.7%
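
The category buckets above ("paintings art", "interior objects", "text visuals") match Imagga's personal_photos categorizer. A minimal sketch of that REST call with the requests library; the credentials and image URL are placeholders, and the endpoint path and response shape follow Imagga's public v2 API as an assumption:

```python
# Sketch only: Imagga categorizer call returning buckets with confidences.
import requests

API_KEY = "your_imagga_key"        # placeholder credentials
API_SECRET = "your_imagga_secret"
IMAGE_URL = "https://example.org/barger_train_car.jpg"  # hypothetical

response = requests.get(
    "https://api.imagga.com/v2/categories/personal_photos",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
)

for category in response.json()["result"]["categories"]:
    print(f"{category['name']['en']} {category['confidence']:.1f}%")
```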

Captions

Microsoft
created on 2022-01-22

an old photo of a man 70.8%
old photo of a man 69.6%
a group of people posing for a photo 59.4%

Text analysis

Amazon

DE
DE 15A-1
97
15A-1
LO
VT77092

Google

97 DE 15A
97
15A
DE
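
The tokens above appear to be the markings on the front of the train car, captured at both line and word granularity. A minimal sketch of the corresponding AWS Rekognition DetectText call, with the same hypothetical file name as in the earlier sketches:

```python
# Sketch only: OCR over the photograph via Rekognition DetectText.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("barger_train_car.jpg", "rb") as f:  # hypothetical local scan
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

# LINE detections group WORD detections, which is likely why the record
# lists both "DE 15A-1" and its parts "DE" and "15A-1" separately.
for text in response["TextDetections"]:
    print(f"{text['Type']:4} {text['DetectedText']}")
```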