Human Generated Data

Title

Untitled (three men in bow ties decorating car)

Date

1958

People

Artist: John Deusing, American, active 1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.1581

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Human 99.5
Person 99.5
Person 98.8
Person 98.6
Clothing 93.5
Apparel 93.5
Transportation 85.9
Vehicle 84.7
Car 80.6
Automobile 80.6
Boat 77.5
Shorts 70.5
Outdoors 68.7
Car 67.9
Housing 67.2
Building 67.2
Nature 66.2
Coat 61.6
Suit 59.7
Overcoat 59.7
Urban 59.3
Fashion 56.5
Robe 56.5
Spoke 55.9
Machine 55.9

Clarifai
created on 2023-10-15

people 99.9
transportation system 99.4
adult 99.3
vehicle 98.6
monochrome 98.6
man 97.3
woman 97
street 97
group together 96.4
group 95
administration 93.7
many 93.3
car 91.3
wear 88.7
several 88.2
child 86.4
driver 83.9
retro 82.7
two 81.7
convertible 81.2

Imagga
created on 2021-12-14

car 37.3
model t 29.2
jinrikisha 26.7
vehicle 21.7
cart 21.3
building 21.1
carriage 20.9
motor vehicle 20
city 19.9
transportation 18.8
wheeled vehicle 18.7
architecture 18
old 17.4
wagon 16.8
street 16.6
travel 14.8
seat 14.4
town 13
bass 12.2
person 12.1
device 11.7
people 11.7
automobile 11.5
adult 11
wheelchair 10.8
snow 10
chair 9.9
outdoors 9.7
metal 9.7
urban 9.6
auto 9.6
support 9.5
man 9.4
house 9.2
transport 9.1
vintage 9.1
road 9
sky 8.9
machine 8.5
black 8.4
classic 8.4
back 8.3
window 8.2
religion 8.1
rumble seat 7.7
industry 7.7
wheel 7.5
tourism 7.4
safety 7.4
historic 7.3
business 7.3
color 7.2
tower 7.2
male 7.1
day 7.1

Microsoft
created on 2021-12-14

text 98.6
outdoor 91.4
black and white 86.5
vehicle 77.1
land vehicle 63.8
car 61.3
old 55.4
several 11.2

Face analysis

Amazon

Google

AWS Rekognition

Age 22-34
Gender Male, 84.2%
Sad 34.5%
Angry 33.2%
Calm 17.6%
Happy 13.1%
Confused 0.6%
Fear 0.4%
Surprised 0.4%
Disgusted 0.2%

AWS Rekognition

Age 50-68
Gender Female, 90.5%
Calm 94.8%
Sad 4.8%
Surprised 0.2%
Confused 0.1%
Happy 0%
Fear 0%
Angry 0%
Disgusted 0%

AWS Rekognition

Age 47-65
Gender Male, 88%
Calm 93.7%
Angry 2.6%
Sad 2.3%
Happy 0.7%
Surprised 0.3%
Confused 0.3%
Disgusted 0.1%
Fear 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.5%
Car 80.6%
Boat 77.5%

Captions

Microsoft
created on 2021-12-14

a vintage photo of a bus 42%
a vintage photo of a truck 41.9%
a vintage photo of a car 41.8%

Text analysis

Amazon

SAFETY
10
FILM
KODAK
Surez

Google

HE
Sumy
SAFETY
FILM
HE Sumy KODAK SAFETY FILM 10
KODAK
10