Human Generated Data

Title

Untitled (woman sitting on floor with baby and dog in lap with two children playing in background of Christmas living room)

Date

1948

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9151

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 98.6
Human 98.6
Person 97.6
Workshop 96.1
Person 87.5
Car 70.3
Automobile 70.3
Vehicle 70.3
Transportation 70.3
People 68
Portrait 63.4
Face 63.4
Photography 63.4
Photo 63.4
Furniture 63.1
Tire 62
Outdoors 61.5
Wheel 60.4
Machine 60.4
Spoke 57.2
Wood 56.3
Clinic 55.2
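
The tag list above is consistent with output from AWS Rekognition label detection, where each tag is followed by its confidence score in percent. Below is a minimal sketch of how comparable tags could be produced with boto3; the image filename, AWS region, and thresholds are placeholder assumptions, not part of this record.

import boto3

# Placeholder region and local image path, used for illustration only.
client = boto3.client("rekognition", region_name="us-east-1")
with open("photograph.jpg", "rb") as f:
    image_bytes = f.read()

# DetectLabels returns label names with confidence percentages,
# similar to the "Person 98.6" and "Workshop 96.1" entries listed above.
response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=25,
    MinConfidence=55,
)
for label in response["Labels"]:
    print(label["Name"], round(label["Confidence"], 1))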

Imagga
created on 2022-01-23

vehicle 39.4
wheeled vehicle 30.1
man 28.9
motor vehicle 27.9
conveyance 27.1
golf equipment 26
people 24
person 22.8
sidecar 22.8
male 22.7
adult 21.5
car 21.2
sitting 19.7
sports equipment 19.6
equipment 18.3
snowmobile 18.2
chair 17.2
lifestyle 16.6
tracked vehicle 16.1
wheelchair 15.6
men 15.4
seat 14.9
transportation 13.4
women 13.4
automobile 13.4
happy 13.1
indoors 12.3
portrait 11.6
motor 11.6
auto 11.5
smile 11.4
drive 11.3
bobsled 11.1
work 11
job 10.6
cheerful 10.6
outdoors 10.4
one 10.4
looking 10.4
smiling 10.1
laptop 10.1
transport 10
leisure 10
working 9.7
driver 9.7
driving 9.7
engine 9.6
home 9.6
device 9.4
motor scooter 9.3
outdoor 9.2
speed 9.2
indoor 9.1
attractive 9.1
fashion 9
health 9
fun 9
sled 8.9
computer 8.8
ride 8.7
room 8.6
wheel 8.5
sport 8.5
house 8.3
road 8.1
lady 8.1
machine 7.9
furniture 7.9
outside 7.7
jeans 7.6
casual 7.6
business 7.3
cute 7.2
worker 7.1
classroom 7
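
Imagga tags such as these typically come from its image-tagging endpoint. The following is a rough sketch against Imagga's public v2 tagging API; the API key, secret, image path, and the exact response layout assumed here are illustrative, not taken from this record.

import requests

# Placeholder credentials and file path for illustration only.
api_key = "YOUR_IMAGGA_API_KEY"
api_secret = "YOUR_IMAGGA_API_SECRET"

with open("photograph.jpg", "rb") as f:
    response = requests.post(
        "https://api.imagga.com/v2/tags",
        auth=(api_key, api_secret),
        files={"image": f},
    )

# Each result pairs a tag name with a confidence score,
# comparable to "vehicle 39.4" or "man 28.9" above.
for tag in response.json()["result"]["tags"]:
    print(tag["tag"]["en"], round(tag["confidence"], 1))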

Microsoft
created on 2022-01-23

road 95.7
text 94.6
outdoor 93.2
black and white 84.3
wheel 72.2
drawing 58
cartoon 57.5
land vehicle 55

Face analysis

Amazon

AWS Rekognition

Age 10-18
Gender Female, 69.4%
Sad 68.6%
Calm 26.1%
Confused 3.4%
Angry 0.8%
Disgusted 0.4%
Surprised 0.3%
Happy 0.2%
Fear 0.2%

AWS Rekognition

Age 19-27
Gender Female, 55.3%
Calm 85%
Happy 13.6%
Surprised 0.5%
Disgusted 0.3%
Confused 0.2%
Fear 0.2%
Angry 0.1%
Sad 0.1%

AWS Rekognition

Age 29-39
Gender Male, 77.3%
Calm 41.5%
Sad 27.9%
Fear 8.5%
Happy 6.6%
Surprised 4.9%
Confused 4.7%
Disgusted 3.5%
Angry 2.4%
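
The three blocks above follow the shape of AWS Rekognition's DetectFaces output: an estimated age range, a gender guess with confidence, and a ranked emotion distribution per detected face. A minimal sketch with boto3 follows; the image path and region are assumptions.

import boto3

client = boto3.client("rekognition", region_name="us-east-1")
with open("photograph.jpg", "rb") as f:
    image_bytes = f.read()

# Attributes=["ALL"] requests age range, gender, and emotion scores per face.
response = client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions arrive as type/confidence pairs; sort to mirror the listing above.
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")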

Feature analysis

Amazon

Person 98.6%
Wheel 60.4%

Captions

Microsoft

a person sitting on a motorcycle in front of a building 37.3%
a person sitting on a motorcycle in front of a store 32.7%
a person sitting on a motorcycle 32.6%

Text analysis

Amazon

8
Adgats
MJI
113150
UNCOM
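
Short fragments like these are typical of AWS Rekognition's DetectText operation picking up film-edge and studio markings in the scan. A minimal sketch with boto3; the image path and region are assumptions.

import boto3

client = boto3.client("rekognition", region_name="us-east-1")
with open("photograph.jpg", "rb") as f:
    image_bytes = f.read()

# DetectText returns both LINE and WORD detections; printing the lines
# yields short strings like "Adgats" or "113150" as listed above.
response = client.detect_text(Image={"Bytes": image_bytes})
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"], round(detection["Confidence"], 1))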

Google

Adgatr 3 1S0 MJ17 YT3RA2 02MA
Adgatr
3
1S0
MJ17
YT3RA2
02MA